Unverified Commit 06a5e042 authored by Lia Domide, committed by GitHub

Merge pull request #27 from the-virtual-brain/tvb-rest

Add REST API 
parents 3a77e83f 840c0f96
include LICENSE
include tvb/config/logger/*.conf
recursive-include tvb *
recursive-exclude tvb *.pyc
\ No newline at end of file
recursive-exclude tvb *.pyc
prune tvb/interfaces/rest/client
\ No newline at end of file
include LICENSE
include tvb/config/logger/*.conf
recursive-include tvb *
recursive-exclude tvb *.pyc
prune tvb/interfaces/rest/server
prune tvb/interfaces/web
prune tvb/interfaces/command
prune tvb/tests
\ No newline at end of file
TVB REST client
===============
The tvb-rest-client is a helper package built to simplify client interaction with the TVB REST API.
All the logic necessary to prepare and send requests towards the REST server is embedded in a client API.

**GET** requests are sent from this Python client using the **requests** library.
For **POST** requests, the client has to attach a file with the input configuration.
Such a file is usually an **H5** file in the TVB-specific format.
Thus, tvb-rest-client contains all the logic for preparing these H5 files and sending the requests.

Usage
=====
You should provide the URL of the TVB REST server.
For the following examples, we assume the TVB REST server runs at *http://localhost:9090*.

Accessing the client API entrypoint:
------------------------------------
If the TVB REST server you want to access runs at a different address, change the parameter in the TVBClient instantiation below.

.. code-block:: python

    from tvb.interfaces.rest.client.tvb_client import TVBClient
    main_client = TVBClient("http://localhost:9090")
..

Start using the client API to send requests by calling the different types of methods:

- methods that return a list of DTOs
.. code-block:: python

    list_of_users = main_client.get_users()
    list_of_user_projects = main_client.get_project_list(list_of_users[0].username)
    list_of_datatypes_in_project = main_client.get_data_in_project(list_of_user_projects[0].gid)
    list_of_operations_for_datatype = main_client.get_operations_for_datatype(list_of_datatypes_in_project[0].gid)
..

- methods that download data files locally, into a folder chosen by the client
.. code-block:: python

    main_client.retrieve_datatype(list_of_datatypes_in_project[0].gid, download_folder)
..

- methods that launch operations on the TVB server
Such an operation requires the client to prepare the operation configuration and send it in an H5 file together with the request.
By using the client API, the user only needs to instantiate the proper Model class and pass it as an argument to the method below.
The client API wraps the serialization of the Model into an H5 file and attaches it to the POST request.
The example below launches a Fourier analyzer; we assume the Fourier AlgorithmDTO is *list_of_operations_for_datatype[0]*.
A complete end-to-end sketch, from launching an operation to retrieving its results, follows this list.
.. code-block:: python

    project_gid = list_of_user_projects[0].gid
    operation_module = list_of_operations_for_datatype[0].module
    operation_class = list_of_operations_for_datatype[0].classname
    model = FFTAdapterModel()
    # logic to fill the model with required attributes
    operation_gid = main_client.launch_operation(project_gid, operation_module, operation_class, model)
..

- methods that monitor the status of an operation
.. code-block:: python

    status = main_client.get_operation_status(operation_gid)
..
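
Putting the pieces together, a typical workflow launches an operation, polls its status and then downloads the results.
The sketch below is a minimal illustration that reuses *main_client*, *project_gid*, *operation_module*, *operation_class* and *model* from the launch example and calls only the client methods shown above; the *FINISHED* / *ERROR* status literals and the polling interval are assumptions made for illustration, so check them against the statuses your TVB REST server actually reports.

.. code-block:: python

    import time

    # main_client, project_gid, operation_module, operation_class and model
    # are prepared exactly as in the launch example above.
    operation_gid = main_client.launch_operation(project_gid, operation_module, operation_class, model)

    # Poll until the server reports a terminal status.
    # "FINISHED" and "ERROR" are assumed literals - adjust them to the actual
    # operation statuses reported by your TVB REST server.
    status = main_client.get_operation_status(operation_gid)
    while status not in ("FINISHED", "ERROR"):
        time.sleep(5)
        status = main_client.get_operation_status(operation_gid)

    # When done, list the datatypes in the project and download them locally.
    if status == "FINISHED":
        for datatype in main_client.get_data_in_project(project_gid):
            main_client.retrieve_datatype(datatype.gid, download_folder)
..

Downloading every datatype in the project keeps the sketch limited to the methods documented here; in practice you would filter that list for the datatypes produced by the launched operation.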
\ No newline at end of file
@@ -40,25 +40,26 @@ import os
import shutil
import setuptools
VERSION = "2.0a1"
VERSION = "2.0a0"
TVB_TEAM = "Mihai Andrei, Lia Domide, Stuart Knock, Bogdan Neacsa, Paula Popa, Paula Sansz Leon, Marmaduke Woodman"
TVB_TEAM = "Mihai Andrei, Lia Domide, Stuart Knock, Bogdan Neacsa, Paula Sansz Leon, Marmaduke Woodman"
TVB_INSTALL_REQUIREMENTS = ["allensdk", "BeautifulSoup4", "cherrypy", "formencode", "Jinja2",
"h5py", "networkx", "nibabel", "numpy", "Pillow", "psutil", "scipy",
TVB_INSTALL_REQUIREMENTS = ["allensdk", "BeautifulSoup4", "cherrypy", "flask", "flask-restplus", "formencode",
"gevent", "h5py", "Jinja2", "networkx", "nibabel", "numpy", "Pillow", "psutil", "scipy",
"simplejson", "sqlalchemy", "sqlalchemy-migrate", "tvb-data", "tvb-gdist", "tvb-library"]
# Packaging tvb-framework with REST server inside
with open(os.path.join(os.path.dirname(__file__), 'README.rst')) as fd:
DESCRIPTION = fd.read()
setuptools.setup(name="tvb-framework",
version=VERSION,
packages=setuptools.find_packages(),
packages=setuptools.find_packages(
exclude=['tvb.interfaces.rest.client', 'tvb.interfaces.rest.client.*']),
include_package_data=True,
install_requires=TVB_INSTALL_REQUIREMENTS,
extras_require={'postgres': ["psycopg2"],
'test': ["pytest", "pytest-benchmark"]},
'test': ["pytest", "pytest-benchmark", "pytest-mock"]},
description='A package for performing whole brain simulations',
long_description=DESCRIPTION,
license="GPL v3",
# -*- coding: utf-8 -*-
#
#
# TheVirtualBrain-Framework Package. This package holds all Data Management, and
# Web-UI helpful to run brain-simulations. To use it, you also need to download
# TheVirtualBrain-Scientific Package (for simulators). See content of the
# documentation-folder for more details. See also http://www.thevirtualbrain.org
#
# (c) 2012-2020, Baycrest Centre for Geriatric Care ("Baycrest") and others
#
# This program is free software: you can redistribute it and/or modify it under the
# terms of the GNU General Public License as published by the Free Software Foundation,
# either version 3 of the License, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT ANY
# WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A
# PARTICULAR PURPOSE. See the GNU General Public License for more details.
# You should have received a copy of the GNU General Public License along with this
# program. If not, see <http://www.gnu.org/licenses/>.
#
#
# CITATION:
# When using The Virtual Brain for scientific publications, please cite it as follows:
#
# Paula Sanz Leon, Stuart A. Knock, M. Marmaduke Woodman, Lia Domide,
# Jochen Mersmann, Anthony R. McIntosh, Viktor Jirsa (2013)
# The Virtual Brain: a simulator of primate brain network dynamics.
# Frontiers in Neuroinformatics (7:10. doi: 10.3389/fninf.2013.00010)
#
#
"""
This is used to package the tvb-rest-client separately.
"""
import os
import shutil
import setuptools
from setuptools.command.egg_info import manifest_maker
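# Use the dedicated REST-client manifest template instead of the default MANIFEST.in,
# so the source distribution prunes server-only code (REST server, web and command
# interfaces, tests), matching the prune rules in MANIFEST_rest_client.in.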
manifest_maker.template = 'MANIFEST_rest_client.in'
VERSION = "2.0a1"
TVB_TEAM = "Lia Domide, Paula Popa, Bogdan Valean, Robert Vincze"
TVB_INSTALL_REQUIREMENTS = ["allensdk", "h5py", "networkx", "nibabel", "numpy", "Pillow", "psutil", "requests", "scipy",
"simplejson", "sqlalchemy", "sqlalchemy-migrate", "tvb-data", "tvb-gdist", "tvb-library"]
# Packaging tvb-rest-client
with open(os.path.join(os.path.dirname(__file__), 'README_rest_client.rst')) as fd:
DESCRIPTION = fd.read()
setuptools.setup(name="tvb-rest-client",
version=VERSION,
packages=setuptools.find_packages(
exclude=['tvb.interfaces.web', 'tvb.interfaces.web.*', 'tvb.interfaces.command',
'tvb.interfaces.command.*', 'tvb.tests', 'tvb.tests.*']),
include_package_data=True,
install_requires=TVB_INSTALL_REQUIREMENTS,
extras_require={'postgres': ["psycopg2"],
'test': ["pytest", "pytest-benchmark"]},
description='A helper package for preparing and sending requests towards the TVB REST API',
long_description=DESCRIPTION,
license="GPL v3",
author=TVB_TEAM,
author_email='tvb.admin@thevirtualbrain.org',
url='http://www.thevirtualbrain.org',
download_url='https://github.com/the-virtual-brain/tvb-framework',
keywords='tvb rest client brain simulator neuroscience human animal neuronal dynamics models delay')
# Clean after install
shutil.rmtree('tvb_rest_client.egg-info', True)
@@ -31,6 +31,7 @@
import os
from abc import abstractmethod
from tvb.adapters.analyzers.matlab_worker import MatlabWorker
from tvb.basic.neotraits.api import Attr
from tvb.basic.profile import TvbProfile
from tvb.core.adapters.abcadapter import ABCAsynchronous, ABCAdapterForm
from tvb.core.entities.filters.chain import FilterChain
@@ -38,8 +39,9 @@ from tvb.adapters.datatypes.db.connectivity import ConnectivityIndex
from tvb.adapters.datatypes.db.graph import ConnectivityMeasureIndex
from tvb.adapters.datatypes.db.mapped_value import ValueWrapperIndex
from tvb.core.entities.model.model_operation import AlgorithmTransientGroup
from tvb.core.neotraits.forms import DataTypeSelectField
from tvb.core.neotraits.forms import TraitDataTypeSelectField
from tvb.core.utils import extract_matlab_doc_string
from tvb.datatypes.connectivity import Connectivity
from tvb.datatypes.graph import ConnectivityMeasure
BCT_GROUP_MODULARITY = AlgorithmTransientGroup("Modularity Algorithms", "Brain Connectivity Toolbox", "bct")
@@ -62,9 +64,9 @@ def bct_description(mat_file_name):
class BaseBCTForm(ABCAdapterForm):
def __init__(self, prefix='', project_id=None, draw_ranges=True):
super(BaseBCTForm, self).__init__(prefix, project_id, draw_ranges)
self.connectivity = DataTypeSelectField(self.get_required_datatype(), self, name="connectivity",
required=True, label=self.get_connectivity_label(),
conditions=self.get_filters(), has_all_option=True)
self.connectivity = TraitDataTypeSelectField(Attr(field_type=Connectivity, label=self.get_connectivity_label()),
self, name="connectivity", conditions=self.get_filters(),
has_all_option=True)
@staticmethod
def get_required_datatype():
@@ -43,7 +43,6 @@ from scipy import linalg
from scipy.spatial.distance import pdist
from sklearn.cluster import DBSCAN
from sklearn.manifold import SpectralEmbedding
from tvb.basic.logger.builder import get_logger
from tvb.basic.neotraits.api import HasTraits, Attr, Float
from tvb.basic.neotraits.info import narray_describe
from tvb.core.adapters.abcadapter import ABCAsynchronous, ABCAdapterForm
@@ -54,14 +53,13 @@ from tvb.core.entities.filters.chain import FilterChain
from tvb.adapters.datatypes.db.fcd import FcdIndex
from tvb.adapters.datatypes.db.graph import ConnectivityMeasureIndex
from tvb.adapters.datatypes.db.time_series import TimeSeriesRegionIndex
from tvb.core.neotraits.forms import DataTypeSelectField, ScalarField
from tvb.core.neotraits.forms import ScalarField, TraitDataTypeSelectField
from tvb.core.neocom import h5
from tvb.core.neotraits.view_model import ViewModel, DataTypeGidAttr
from tvb.datatypes.fcd import Fcd
from tvb.datatypes.graph import ConnectivityMeasure
from tvb.datatypes.time_series import TimeSeriesRegion
LOG = get_logger(__name__)
class FcdCalculator(HasTraits):
"""
@@ -92,15 +90,26 @@ class FcdCalculator(HasTraits):
between FC(ti) and FC(tj) arranged in a vector""")
class FCDAdapterModel(ViewModel, FcdCalculator):
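# sw and sp are inherited as traits from FcdCalculator; only the time series input
# is re-declared here as a GID reference (DataTypeGidAttr) for the view model.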
time_series = DataTypeGidAttr(
linked_datatype=TimeSeriesRegion,
label="Time Series",
required=True,
doc="""The time-series for which the fcd matrices are calculated."""
)
class FCDAdapterForm(ABCAdapterForm):
def __init__(self, prefix='', project_id=None):
super(FCDAdapterForm, self).__init__(prefix, project_id)
self.time_series = DataTypeSelectField(self.get_required_datatype(), self, name=self.get_input_name(),
required=True, label=FcdCalculator.time_series.label,
doc=FcdCalculator.time_series.doc, conditions=self.get_filters(),
has_all_option=True)
self.sw = ScalarField(FcdCalculator.sw, self)
self.sp = ScalarField(FcdCalculator.sp, self)
self.time_series = TraitDataTypeSelectField(FCDAdapterModel.time_series, self, name=self.get_input_name(),
conditions=self.get_filters(), has_all_option=True)
self.sw = ScalarField(FCDAdapterModel.sw, self)
self.sp = ScalarField(FCDAdapterModel.sp, self)
@staticmethod
def get_view_model():
return FCDAdapterModel
@staticmethod
def get_required_datatype():
@@ -161,7 +170,8 @@ class FunctionalConnectivityDynamicsAdapter(ABCAsynchronous):
def get_output(self):
return [FcdIndex, ConnectivityMeasureIndex]
def configure(self, time_series, sw, sp):
def configure(self, view_model):
# type: (FCDAdapterModel) -> None
"""
Store the input shape to be later used to estimate memory usage. Also create the algorithm instance.
@@ -172,14 +182,14 @@ class FunctionalConnectivityDynamicsAdapter(ABCAsynchronous):
"""
Store the input shape to be later used to estimate memory usage. Also create the algorithm instance.
"""
self.input_time_series_index = time_series
self.input_time_series_index = self.load_entity_by_gid(view_model.time_series.hex)
self.input_shape = (self.input_time_series_index.data_length_1d,
self.input_time_series_index.data_length_2d,
self.input_time_series_index.data_length_3d,
self.input_time_series_index.data_length_4d)
LOG.debug("time_series shape is %s" % str(self.input_shape))
self.actual_sp = float(sp) / time_series.sample_period
self.actual_sw = float(sw) / time_series.sample_period
self.log.debug("time_series shape is %s" % str(self.input_shape))
self.actual_sp = float(view_model.sp) / self.input_time_series_index.sample_period
self.actual_sw = float(view_model.sw) / self.input_time_series_index.sample_period
actual_ts_length = self.input_shape[0]
if self.actual_sw >= actual_ts_length or self.actual_sp >= actual_ts_length or self.actual_sp >= self.actual_sw:
@@ -188,11 +198,13 @@ class FunctionalConnectivityDynamicsAdapter(ABCAsynchronous):
"and Sp < Sw. After calibration with sampling period, current values are: Sp=%d, Sw=%d, Ts=%d). "
"Please configure valid input parameters." % (self.actual_sp, self.actual_sw, actual_ts_length))
def get_required_memory_size(self, **kwargs):
def get_required_memory_size(self, view_model):
# type: (FCDAdapterModel) -> int
# We do not know how much memory is needed.
return -1
def get_required_disk_size(self, **kwargs):
def get_required_disk_size(self, view_model):
# type: (FCDAdapterModel) -> int
return 0
@staticmethod
@@ -214,7 +226,8 @@ class FunctionalConnectivityDynamicsAdapter(ABCAsynchronous):
fcd_h5.labels_ordering.store(json.dumps(Fcd.labels_ordering.default))
return fcd_h5.array_data.get_cached_metadata()
def launch(self, time_series, sw, sp):
def launch(self, view_model):
# type: (FCDAdapterModel) -> [FcdIndex]
"""
Launch algorithm and build results.
@@ -234,8 +247,9 @@ class FunctionalConnectivityDynamicsAdapter(ABCAsynchronous):
fcd_index = FcdIndex()
fcd_h5_path = h5.path_for(self.storage_path, FcdH5, fcd_index.gid)
with FcdH5(fcd_h5_path) as fcd_h5:
fcd_array_metadata = self._populate_fcd_h5(fcd_h5, fcd, fcd_index.gid, time_series.gid, sw, sp)
self._populate_fcd_index(fcd_index, time_series.gid, fcd, fcd_array_metadata)
fcd_array_metadata = self._populate_fcd_h5(fcd_h5, fcd, fcd_index.gid, self.input_time_series_index.gid,
view_model.sw, view_model.sp)
self._populate_fcd_index(fcd_index, self.input_time_series_index.gid, fcd, fcd_array_metadata)
result.append(fcd_index)
if np.amax(fcd_segmented) == 1.1:
@@ -243,8 +257,11 @@ class FunctionalConnectivityDynamicsAdapter(ABCAsynchronous):
result_fcd_segmented_h5_path = h5.path_for(self.storage_path, FcdH5, result_fcd_segmented_index.gid)
with FcdH5(result_fcd_segmented_h5_path) as result_fcd_segmented_h5:
fcd_segmented_metadata = self._populate_fcd_h5(result_fcd_segmented_h5, fcd_segmented,
result_fcd_segmented_index.gid, time_series.gid, sw, sp)
self._populate_fcd_index(result_fcd_segmented_index, time_series.id, fcd_segmented, fcd_segmented_metadata)
result_fcd_segmented_index.gid,
self.input_time_series_index.gid, view_model.sw,
view_model.sp)
self._populate_fcd_index(result_fcd_segmented_index, self.input_time_series_index.id, fcd_segmented,
fcd_segmented_metadata)
result.append(result_fcd_segmented_index)
for mode in eigvect_dict.keys():
@@ -272,8 +289,8 @@ class FunctionalConnectivityDynamicsAdapter(ABCAsynchronous):
return result
def _compute_fcd_matrix(self, ts_h5):
LOG.debug("timeseries_h5.data")
LOG.debug(narray_describe(ts_h5.data[:]))
self.log.debug("timeseries_h5.data")
self.log.debug(narray_describe(ts_h5.data[:]))
input_shape = ts_h5.data.shape
result_shape = self._result_shape(input_shape)
@@ -301,8 +318,8 @@ class FunctionalConnectivityDynamicsAdapter(ABCAsynchronous):
fcd[j, i, var, mode] = fcd[i, j, var, mode]
j += 1
LOG.debug("FCD")
LOG.debug(narray_describe(fcd))
self.log.debug("FCD")
self.log.debug(narray_describe(fcd))
num_eig = 3 # number of the eigenvector that will be extracted
@@ -38,31 +38,79 @@ Adapter that uses the traits module to generate interfaces for BalloonModel Analyzer
import uuid
import numpy
from tvb.analyzers.fmri_balloon import BalloonModel
from tvb.basic.neotraits.api import Float, Attr
from tvb.core.neotraits.view_model import ViewModel, DataTypeGidAttr
from tvb.datatypes.time_series import TimeSeries
from tvb.core.adapters.abcadapter import ABCAsynchronous, ABCAdapterForm
from tvb.core.entities.filters.chain import FilterChain
from tvb.basic.logger.builder import get_logger
from tvb.adapters.datatypes.h5.time_series_h5 import TimeSeriesRegionH5
from tvb.adapters.datatypes.db.time_series import TimeSeriesIndex, TimeSeriesRegionIndex
from tvb.core.neotraits.forms import DataTypeSelectField, ScalarField
from tvb.core.neotraits.forms import ScalarField, TraitDataTypeSelectField
from tvb.core.neotraits.db import prepare_array_shape_meta
from tvb.core.neocom import h5
LOG = get_logger(__name__)
class BalloonModelAdapterModel(ViewModel):
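# View model exposing the BalloonModel analyzer inputs (time_series, dt,
# neural_input_transformation, bold_model, RBM), so the form and adapter read
# them from here instead of from BalloonModel directly.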
time_series = DataTypeGidAttr(
linked_datatype=TimeSeries,
label="Time Series",
required=True,
doc="""The timeseries that represents the input neural activity"""
)
dt = Float(
label=":math:`dt`",
default=0.002,
required=True,
doc="""The integration time step size for the balloon model (s).
If none is provided, the TimeSeries sample period is used by default."""
)
neural_input_transformation = Attr(
field_type=str,
label="Neural input transformation",
choices=("none", "abs_diff", "sum"),
default="none",
doc="""This represents the operation to perform on the state-variable(s) of
the model used to generate the input TimeSeries. ``none`` takes the
first state-variable as neural input; ``abs_diff`` is the absolute
value of the derivative (first order difference) of the first state variable;
``sum`` sums all the state-variables of the input TimeSeries."""
)
bold_model = Attr(
field_type=str,
label="Select BOLD model equations",
choices=("linear", "nonlinear"),
default="nonlinear",
doc="""Select the set of equations for the BOLD model."""
)
RBM = Attr(
field_type=bool,
label="Revised BOLD Model",
default=True,
required=True,
doc="""Select classical vs revised BOLD model (CBM or RBM).
Coefficients k1, k2 and k3 will be derived accordingly."""
)
class BalloonModelAdapterForm(ABCAdapterForm):
def __init__(self, prefix='', project_id=None):
super(BalloonModelAdapterForm, self).__init__(prefix, project_id)
self.time_series = DataTypeSelectField(self.get_required_datatype(), self, name=self.get_input_name(),
required=True, label=BalloonModel.time_series.label,
doc=BalloonModel.time_series.doc, conditions=self.get_filters(),
has_all_option=True)
self.dt = ScalarField(BalloonModel.dt, self)
self.neural_input_transformation = ScalarField(BalloonModel.neural_input_transformation, self)
self.bold_model = ScalarField(BalloonModel.bold_model, self)
self.RBM = ScalarField(BalloonModel.RBM, self)
self.time_series = TraitDataTypeSelectField(BalloonModelAdapterModel.time_series, self,
name=self.get_input_name(),
conditions=self.get_filters(), has_all_option=True)
self.dt = ScalarField(BalloonModelAdapterModel.dt, self)
self.neural_input_transformation = ScalarField(BalloonModelAdapterModel.neural_input_transformation, self)
self.bold_model = ScalarField(BalloonModelAdapterModel.bold_model, self)
self.RBM = ScalarField(BalloonModelAdapterModel.RBM, self)
@staticmethod
def get_view_model():
return BalloonModelAdapterModel
@staticmethod
def get_required_datatype():
@@ -95,36 +143,38 @@ class BalloonModelAdapter(ABCAsynchronous):
def get_output(self):
return [TimeSeriesRegionIndex]
def configure(self, time_series, dt=None, bold_model=None, RBM=None, neural_input_transformation=None):
def configure(self, view_model):
# type: (BalloonModelAdapterModel) -> None
"""
Store the input shape to be later used to estimate memory usage. Also
create the algorithm instance.
"""
self.input_time_series_index = time_series
self.input_time_series_index = self.load_entity_by_gid(view_model.time_series.hex)
self.input_shape = (self.input_time_series_index.data_length_1d,
self.input_time_series_index.data_length_2d,
self.input_time_series_index.data_length_3d,
self.input_time_series_index.data_length_4d)
LOG.debug("time_series shape is %s" % str(self.input_shape))
self.log.debug("time_series shape is %s" % str(self.input_shape))
# -------------------- Fill Algorithm for Analysis -------------------##
algorithm = BalloonModel()
if dt is not None:
algorithm.dt = dt
if view_model.dt is not None:
algorithm.dt = view_model.dt
else:
algorithm.dt = time_series.sample_period / 1000.
algorithm.dt = self.input_time_series_index.sample_period / 1000.
if bold_model is not None:
algorithm.bold_model = bold_model
if RBM is not None:
algorithm.RBM = RBM
if neural_input_transformation is not None:
algorithm.neural_input_transformation = neural_input_transformation
if view_model.bold_model is not None:
algorithm.bold_model = view_model.bold_model
if view_model.RBM is not None:
algorithm.RBM = view_model.RBM
if view_model.neural_input_transformation is not None:
algorithm.neural_input_transformation = view_model.neural_input_transformation
self.algorithm = algorithm
def get_required_memory_size(self, **kwargs):