class gdgps_apps.apps.APPS(settings_file='~/.apps_settings', portal='', api_path='api/user', label=None, key=None, secret=None, download_directory=None, trust_env=False, log_level=None)[source]

Establish a session with the APPS servers.

The APPS class is the main interface for interacting with APPS. It greatly simplifies usage by automatically handling tasks such as data caching, which speeds up client code and reduces back-and-forth communication between your local machine and the APPS servers. APPS comprises more than a single server, which adds complexity to interfacing with it: the portal server is the web server where your account settings and credentials are housed and accessed, and there are several APPS processing servers that data is uploaded to and downloaded from. This class abstracts away that complexity by automatically selecting the currently fastest APPS processing server to upload data to, and by keeping track of where your data is located and how to retrieve it.

To use the APPS class you will need your apps_settings file, which contains your authorization credential. The APPS library signs all your API requests with this credential to keep your data private and secure. Be sure to keep your settings file in a safe, secret place! If at any time you think your account has been compromised, you can reset your credential.

  • settings_file – Path to the settings file containing your authorization credentials.

  • portal – URL of the APPS portal to use. Most likely you will never need to override the default. Supplying this parameter will override the parameter from your settings file.

  • api_path – The path of the API to use. Supplying this parameter will override the parameter from your settings file.

  • key – Manually supply your API credential’s key. This will override the parameter from your settings file.

  • secret – Manually supply your API credential’s secret. This will override the parameter from your settings file.

  • download_directory – The directory in which any files downloaded from APPS will be stored. By default this is the current working directory.

  • trust_env – Trust environment settings for proxy configuration, default authentication and similar.

  • log_level – Set the logging level, OFF is silent and DEBUG is the highest verbosity. Available levels are {OFF,DEBUG,INFO,WARNING,ERROR,CRITICAL}. If log_level is None, no logging configuration will be performed. (see Logging and Output and gdgps_apps.apps.APPS.set_log_level()).

MUTABLE_CREDENTIALS = ['is_staff', 'is_superuser']

Approve the given data item for processing. This will only have an effect for data in the Verified state.


dataid – The UUID of the data item to approve


static base_url(url)[source]

Parse out the root path of any given URL.


url – a valid url string


The root URL path, or None if the input could not be parsed
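The library's parsing is internal, but a minimal sketch of this behavior using only the standard library (not the actual implementation) might look like:

```python
from urllib.parse import urlsplit

def base_url(url):
    """Return the scheme://host root of a URL, or None if it cannot be parsed."""
    try:
        parts = urlsplit(url)
    except ValueError:
        return None
    # A URL without a scheme or network location has no meaningful root.
    if not parts.scheme or not parts.netloc:
        return None
    return '{}://{}'.format(parts.scheme, parts.netloc)
```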

default_settings_path = '~/.apps_settings'

Delete the specified source file from the APPS servers.


selector – Either a source identifier, or a 2-tuple whose first element is the data identifier and whose second element is either None (for all sources), the source type, or the name of the source.


A list containing the sources that were deleted, if any.

detail(uuid, query=None)[source]
property download_directory

The str containing the path of the directory where files downloaded from APPS will be stored. If the path does not exist, the directories will be recursively created. If set to None, the current working directory will be used.


InvalidConfiguration – if the path does not exist and could not be created, or exists but is not a directory

download_result(uuid, result_type=None, stream=True, write_to_disk=True, dr=None, query=None)[source]

Download the results of a given submission.

  • uuid – The identifier of the data submission to download results for.

  • result_type – Optional, the type of result to get. If None or unspecified, all results will be downloaded in an archive.

  • stream

  • write_to_disk

  • dr – The directory to write to, if different from the globally configured APPS download directory.

  • query


download_run(uuid, stream=True, write_to_disk=True, dr=None, query=None)[source]
download_source(uuid, source_type=None, stream=True, write_to_disk=True, dr=None, query=None)[source]

Check if data of a given name or UUID exists.


identifier – Either a name or UUID of a given Data item


True if data of the given name or UUID exists, False otherwise

flags(dataid=None, query=None)[source]
static format_size(byts)[source]

Given a number of bytes, return the most compact representation of it in SI units. The scale factor used is 1024, not 1000. For example:

>>> from gdgps_apps.apps import APPS
>>> APPS.format_size(12312312)

byts – The number of bytes to format, any type convertible to a float is accepted


the scaled str representation of the bytes with the appropriate SI symbol


ValueError – If the input is not a float or convertible to a float
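A plausible sketch of such 1024-based scaling (the symbol set and precision here are assumptions, not the library's actual output format) is:

```python
def format_size(byts):
    """Format a byte count compactly, scaling by 1024 at each step."""
    size = float(byts)  # raises ValueError if not convertible to a float
    for symbol in ('B', 'K', 'M', 'G', 'T', 'P'):
        # Stop scaling once the value is below one step, or we run out of symbols.
        if abs(size) < 1024.0 or symbol == 'P':
            return '{:.1f}{}'.format(size, symbol)
        size /= 1024.0
```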

get_data_parameter(uuid, param, force_refresh=False)[source]
get_processor(location, link=None)[source]

Get the client associated with the processor at the specified location or build one from the given link.

  • location – The label of the processor

  • link – A URL at which the processor can be found; only used if the label of the processor cannot be found.


The client associated with the processor or None if one could not be found or constructed from a link


Get the client associated with the processor node that the given data resides on.


uuid – The identifier of the data submission in question


The client associated with the data’s processor or None if one could not be found

get_source(sid, query=None)[source]

Fetch the data source object given its UUID.


sid – The UUID of the data submission’s source object.


query – Additional query parameters (if supported)


A dictionary representation of the data source. TODO see webservice API doc


Fetch the UUID of data that has the given name. This method will optimistically search the local cache first and if that fails it will query APPS.


name – A str containing the name of the data to search for.


The UUID of the data with the given name as a str, or None if data of the given name does not exist.

static is_alert_level(lvl)[source]

Check whether the given string is a valid APPS Alert level. This method recognizes short and long forms and is case insensitive.


lvl – a string that might represent an Alert level


The normalized form Alert level string or None if it is not a valid Alert level.

static is_flag_level(lvl)[source]

Check whether the given string is a valid APPS DataFlag level. This method recognizes short and long forms and is case insensitive.


lvl – a string that might represent a DataFlag level


The normalized form DataFlag level string or None if it is not a valid DataFlag level.

static is_id(uuid)[source]

Check whether the given string is a valid data UUID. This method is case insensitive, but canonically in APPS UUIDs are lower case.


uuid – a string that might represent a UUID


The normalized lower case form of the UUID or None if it is not a valid UUID.
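As a rough sketch (not the library's implementation), the standard uuid module can perform this kind of validation and normalization; note the real method may be stricter about accepted input forms:

```python
import uuid

def is_id(value):
    """Return the normalized lower-case UUID string, or None if invalid."""
    try:
        # uuid.UUID accepts upper or lower case; str() of it is canonical
        # lower-case hyphenated form.
        return str(uuid.UUID(str(value)))
    except ValueError:
        return None
```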

property is_staff
is_staff_ = False
static is_state(state)[source]

Check whether the given string is a valid data state representation. This method recognizes short and long forms and is case insensitive.


state – a string that might represent a state


The normalized short-form state string, or None if the input string was not a valid Data state
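The short/long normalization pattern used by these checkers can be sketched as follows. The state table here is hypothetical ("Verified" appears in this documentation; the others are placeholders — the real set lives in gdgps_apps.defines):

```python
# Hypothetical state table for illustration only.
STATES = {'V': 'Verified', 'A': 'Available', 'E': 'Error'}

def is_state(state):
    """Return the short form of a state string, or None if unrecognized."""
    if not isinstance(state, str):
        return None
    needle = state.strip().lower()
    for short, long_form in STATES.items():
        # Accept either the short or the long form, case-insensitively.
        if needle in (short.lower(), long_form.lower()):
            return short
    return None
```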

property is_superuser
is_superuser_ = False

List the processors that are currently online and available to the user, as well as their respective loads.


query – Additional query parameters


A list containing dictionaries for each available processor. For example:

    {
        'label': 'pppx1la',  # the label of the node
        'load': ...,         # the current load of the node
        'active': True,
    }

Check connectivity to all APPS servers. This will also refresh the known loads of the APPS processing servers your account is allowed to use. It's not typically necessary to explicitly call this function; other functions that need to refresh this information will usually do so under the covers.


A tuple containing ( True, response ) if your APPS portal was reachable, or ( False, None ) otherwise. The response will be the JSON string response from the portal containing load data.

profile_cache = None
static read_settings(settings_file='~/.apps_settings')[source]
static set_log_level(level, **kwargs)[source]

This will set up a basic logging config while setting the root logging level to the given level. (see Logging and Output)

  • level – The root logging level, this argument can be either the integer logger level, i.e. logging.DEBUG or a string representation of the log level i.e. ‘DEBUG’. This function accepts one additional logger level above what is provided through the Python logging framework, ‘OFF’ which is equivalent to logging.CRITICAL + 1. If level is None no logging configuration will be done.

  • kwargs – Additional arguments to pass to logging.basicConfig



stream_chunk_size = 8128

The user credential houses some information that might change. To avoid having users re-download it, for certain updates we can simply flush the updates to disk from their updated profile.

update_data(uuid, **kwargs)[source]
upload_gipsyx(file, pressure=None, attitude=None, stream=False, processor=None, progress_hook=None, **kwargs)[source]

Upload GNSS data for processing by the GipsyX software suite. It is recommended that you compress your data before upload. Even so, some data is quite large, and a progress hook can be supplied to provide progress information to the user.

For instance, TQDM is a popular terminal-based progress bar library. To call this function with a TQDM progress bar, you could do something like:

from tqdm import tqdm
with tqdm(
        postfix={'file': os.path.basename(file)}
) as progress_bar:
    ret = apps.upload_gipsyx(
        file,
        progress_hook=lambda bytes_read: progress_bar.update(bytes_read)
    )


If a progress_hook is supplied, the call will automatically set stream to True.

If interactive user feedback is not desired you may simply upload data like so:

from pprint import pprint
response = apps.upload_gipsyx(
    'station.rnx',  # hypothetical file name
    processing_mode=apps.defines.GIPSYXData.KINEMATIC,
    pressure_mode=apps.defines.GIPSYXData.NCEP  # illustrative name; see the webservice API
)
upload_id = response['id']
pprint(apps.detail(upload_id))

The above code would upload a RINEX file for kinematic processing and use NCEP data to model atmospheric pressure. It then pulls the data UUID from the response JSON and uses it to fetch and pretty-print complete details about the submission.

  • file – A RINEX file, or a tar file containing a RINEX file and any ancillary source files, such as pressure files or attitude quaternion files. Files may be compressed using bz2, gzip, zip, lzma or Unix compression.

  • pressure – The path to the pressure file to upload, if there is one.

  • attitude – The path to the attitude quaternion file, if there is one. Only helpful for kinematic processing.

  • stream – True if the upload should be streamed, False otherwise. If streaming is enabled the return value of this function will be a requests response object, otherwise the json response of the server will be returned.

  • processor – The label of the processor to use. You should normally leave this blank, APPS will automatically select the least loaded processor to send your data to.

  • progress_hook – A callable that accepts one argument, which is the number of bytes read since the last call.

  • kwargs – Any additional parameters accepted by the APPS webservice for configuring the processing of GNSS data. For instance, to set processing mode to Kinematic you would supply processing_mode=apps.defines.GIPSYXData.KINEMATIC


If not streaming, a json response body from the APPS webservice, if streaming a requests library response object.
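The progress_hook contract described above can be illustrated with a standalone sketch of chunked streaming. The network send is stubbed out, and the chunk size mirrors stream_chunk_size; this is an illustration of the hook semantics, not the library's upload code:

```python
CHUNK_SIZE = 8128  # mirrors APPS.stream_chunk_size

def stream_file(path, send, progress_hook=None):
    """Read a file in chunks, handing each chunk to `send` and reporting
    the number of bytes read to `progress_hook` after every chunk."""
    total = 0
    with open(path, 'rb') as f:
        while True:
            chunk = f.read(CHUNK_SIZE)
            if not chunk:
                break
            send(chunk)  # stand-in for the real network upload
            total += len(chunk)
            if progress_hook:
                progress_hook(len(chunk))
    return total
```

Wiring `progress_hook=progress_bar.update` for a tqdm bar follows the same shape as the upload_gipsyx example above.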

upload_source(dataid, file, source_type=None, stream=False, progress_hook=None, **kwargs)[source]

Upload a file to the specified data submission as an additional source file. Note that the data must be in a mutable state for this operation to work (i.e., pre-processed and not currently verifying). All upload calls provide hooks that enable feedback to users regarding upload progress. TQDM is a popular Python library for providing in-terminal progress bars. For example, one might provide a progress bar using TQDM and this function like so:

from tqdm import tqdm
with tqdm(
        postfix={'file': os.path.basename(pressurefile)}
) as progress_bar:
    apps.upload_source(
        dataid,
        pressurefile,
        progress_hook=lambda bytes_read: progress_bar.update(bytes_read)
    )

  • dataid – The UUID of the data submission to attach this source file to

  • file – The path to the source file

  • source_type – Optionally tell APPS what type of source file this is.

  • stream – If true, stream the upload. This avoids loading the entire file into memory.

  • progress_hook – The progress hook can be any callable that takes the number of bytes read. Its primary use is to enable status updates for the upload. Note, passing a progress_hook will force the upload into streaming mode.

  • kwargs – Additional parameters to pass to APPS for source creation. See the webservices API.