Workflow: Initialize Sample

Workflow input ports:
- working_directory: POSIX path to the working directory where data will be saved (example value: /Users/jdsant/Documents/Wf4Ever/2ndGoldenExemplar/python).
- postgresql_server_ip: Hostname or IP address of the PostgreSQL server (example value: amiga.iaa.csic.es).
- postgresql_server_port: TCP/IP port number of the PostgreSQL server (default: 5432).
- db_username: Database username (example value: sdss).
- db_password: Database password for db_username.
- pickle_filename: Name of the pickle file in which to save the initial sample (example value: par_low.pckl).

Workflow output ports: path_to_galaxy_sample, Tool_STDOUT, Tool_STDERR.

Processor 2par_low.py (net.sf.taverna.t2.activities.externaltool.ExternalToolActivity, default local invocation: shell prefix /bin/sh -c, link command /bin/ln -s %%PATH_TO_ORIGINAL%% %%TARGET_NAME%%). Tool input ports: dbpasswd, dbuser, tcpport, host, pickle_filename, working_directory; tool output ports: STDOUT, STDERR.

Description: Python script that initializes the galaxy sample. Pre-requisites: a PostgreSQL database on the _host_ server at the _port_ TCP port, accessible with _dbuser_ and _dbpasswd_ authorisation, and a gal table in the sdss_dr7 schema with fields r, pz2, e_pz2, pz, e_pz, and bsz.

Invocation command:

python 2par_low.py \
  --host %%host%% \
  --user %%dbuser%% \
  --pass %%dbpasswd%% \
  --port %%tcpport%% \
  --file %%pickle_filename%% \
  --work-directory %%working_directory%%

String-substitution inputs: dbpasswd, dbuser, host, pickle_filename, tcpport, working_directory. Tool file: 2par_low.py.

Script 2par_low.py:

#!/usr/bin/env python
# -*- coding: utf-8 -*-

import psycopg2 as pg
import numpy
import pylab
import time
import pickle
import os

default_schema = "sdss_dr7"

from optparse import OptionParser
parser = OptionParser()
parser.add_option("-F", "--file", dest="filename",
                  help="write sample to FILE", default="par_low.pckl", metavar="FILE")
parser.add_option("-U", "--user", dest="dbuser",
                  help="username in gal database")
parser.add_option("-P", "--pass", dest="dbpasswd",
                  help="password for user in gal database", metavar="user")
parser.add_option("-H", "--host", dest="host",
                  help="hostname of the database server")
parser.add_option("-p", "--port", dest="port",
                  help="TCP port of the database server", default=5432, metavar="port")
parser.add_option("-S", "--schema", dest="schema",
                  help="database SCHEMA (defaults to %s)" % default_schema,
                  default=default_schema, metavar="SCHEMA")
parser.add_option("-w", "--work-directory", dest="wd",
                  help="use WD as working directory", default=os.getcwd(), metavar="WD")
(options, args) = parser.parse_args()

if options.port != None:
    options.port = int(options.port)

optionsAreNotValid = (
    options.dbuser == None or
    options.dbpasswd == None or
    options.host == None or options.host == "" or
    options.port == None or type(options.port) != type(5432) or
    options.filename == None or options.filename == "" or
    options.schema == None or options.schema == "" or
    options.wd == None or options.wd == ""
)
if optionsAreNotValid:
    raise ValueError("Options not valid")

def obf(cadena):
    # Convert an optional database value to float; NULL (None) maps to -1.
    if cadena == None:
        a = -1.
    else:
        a = float(cadena)
    return a

db = pg.connect(database=options.schema, host=options.host, user=options.dbuser,
                password=options.dbpasswd, port=options.port)
c = db.cursor()

t0 = time.time()
c.execute("""
    SELECT g.r, g.pz2, g.e_pz2, g.pz, g.e_pz
    FROM gal g
    WHERE g.bsz IS NULL
""")
db.commit()
n = c.rowcount
t = time.time()
print "%i time: %5.2f s" % (n, t - t0)
t0 = t

#r = numpy.zeros(n,dtype='Float64')
#pz2 = numpy.zeros(n,dtype='Float64')
#e_pz2 = numpy.zeros(n,dtype='Float64')
#pz = numpy.zeros(n,dtype='Float64')
#e_pz = numpy.zeros(n,dtype='Float64')
#pzf = numpy.zeros(n,dtype='Float64')
#e_pzf = numpy.zeros(n,dtype='Float64')
#codf = numpy.zeros(n,dtype='Int32')
r = numpy.zeros(n, dtype='Float32')
pz2 = numpy.zeros(n, dtype='Float32')
e_pz2 = numpy.zeros(n, dtype='Float32')
pz = numpy.zeros(n, dtype='Float32')
e_pz = numpy.zeros(n, dtype='Float32')
pzf = numpy.zeros(n, dtype='Float32')
e_pzf = numpy.zeros(n, dtype='Float32')
codf = numpy.zeros(n, dtype='Int16')

for i in range(n):
    record = c.fetchone()
    r[i] = float(record[0])
    pz2[i] = obf(record[1])
    e_pz2[i] = obf(record[2])
    pz[i] = obf(record[3])
    e_pz[i] = obf(record[4])
    # Select the final photometric redshift (pzf): prefer pz2 for galaxies
    # fainter than r = 17.77 and pz for brighter ones, falling back to the
    # other estimate, or to -1 when both are missing.
    if (r[i] >= 17.77) and (pz2[i] != -1.):
        pzf[i] = pz2[i]
        e_pzf[i] = e_pz2[i]
    elif (r[i] >= 17.77) and (pz2[i] == -1.) and (pz[i] != -1.):
        pzf[i] = pz[i]
        e_pzf[i] = e_pz[i]
    elif (r[i] >= 17.77) and (pz2[i] == -1.) and (pz[i] == -1.):
        pzf[i] = -1.
        e_pzf[i] = -1.
    elif (r[i] < 17.77) and (pz[i] != -1.):
        pzf[i] = pz[i]
        e_pzf[i] = e_pz[i]
    elif (r[i] < 17.77) and (pz[i] == -1.) and (pz2[i] != -1.):
        pzf[i] = pz2[i]
        e_pzf[i] = e_pz2[i]
    elif (r[i] < 17.77) and (pz[i] == -1.) and (pz2[i] == -1.):
        pzf[i] = -1.
        e_pzf[i] = -1.

t = time.time()
print "time: %5.2f s" % (t - t0)
t0 = t

rp = numpy.arange(14.5, 22.5, 0.5)    # Limits of the magnitude r
N = numpy.arange(0.1, 3.2, 0.2)       # Limits of sigmas
z = numpy.arange(0.00, 0.11, 0.01)    # Limits of redshift
par = numpy.zeros((len(N), len(rp), len(z)), dtype="Int32")

t1 = time.time()
for i, zo in enumerate(z):
    for j, No in enumerate(N):
        for k, rpo in enumerate(rp):
            b = len(numpy.where((r < rpo) &
                                (numpy.absolute(pzf - zo) / e_pzf < No) &
                                (pzf != -1.))[0])
            par[j, k, i] = b
            t = time.time()
            print "z: %5.2f; N: %5.2f; r: %5.2f - %i (time: %5.2f)" % (zo, No, rpo, b, t - t1)
            t1 = t

t = time.time()
print "time: %5.2f s" % (t - t0)
t0 = t

cwd = os.getcwd()
os.chdir(options.wd)
fullwd = os.getcwd()
pickle.dump(par, open(options.filename, "wb"))
print "Pickle file dumped on:"
print "%s/%s" % (fullwd, options.filename)

Activity port configuration: the tool inputs host, working_directory, dbpasswd, tcpport, dbuser and pickle_filename are bound one-to-one to the %%...%% placeholders as plain (MacRoman-encoded) string values. The processor uses the standard Taverna dispatch stack: Parallelize (maximum of one concurrent job), ErrorBounce, Failover, Retry (backoff factor 1.0, initial delay 1000 ms, maximum delay 5000 ms, maximum retries 0) and Invoke.
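The pickle written by the script contains only the par datacube, a 3D integer array indexed as [sigma index, r-magnitude index, redshift index]. The following is a minimal sketch of how the output could be inspected after a run; it is not part of the workflow. The file name and the grid definitions simply mirror the defaults in 2par_low.py, and the chosen indices are arbitrary illustration values.

# Minimal sketch: load the datacube produced by 2par_low.py and read one cell.
# Assumes the default output name (par_low.pckl) in the current directory.
import pickle
import numpy

par = pickle.load(open("par_low.pckl", "rb"))

# Same grids as in 2par_low.py: r-magnitude limits, sigma levels, redshifts.
rp = numpy.arange(14.5, 22.5, 0.5)
N = numpy.arange(0.1, 3.2, 0.2)
z = numpy.arange(0.00, 0.11, 0.01)

print "datacube shape (N, rp, z):", par.shape    # expected (16, 16, 11)

# Count of galaxies with r < 20.0, |pzf - 0.05| / e_pzf < 1.1 and a valid pzf.
j, k, i = 5, 11, 5    # N[5] = 1.1, rp[11] = 20.0, z[5] = 0.05
print "counts at sigma=%.1f, r<%.1f, z=%.2f: %i" % (N[j], rp[k], z[i], par[j, k, i])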
Processor get_last_line_of_STDIN (net.sf.taverna.t2.activities.externaltool.ExternalToolActivity, default local invocation: shell prefix /bin/sh -c, link command /bin/ln -s %%PATH_TO_ORIGINAL%% %%TARGET_NAME%%). It pipes its STDIN through the command tail -n 1 and returns the last line on STDOUT, using the same dispatch stack configuration as 2par_low.py (Parallelize, ErrorBounce, Failover, Retry, Invoke).

Data links:
- db_password -> 2par_low.py:dbpasswd
- db_username -> 2par_low.py:dbuser
- postgresql_server_port -> 2par_low.py:tcpport
- postgresql_server_ip -> 2par_low.py:host
- pickle_filename -> 2par_low.py:pickle_filename
- working_directory -> 2par_low.py:working_directory
- 2par_low.py:STDOUT -> get_last_line_of_STDIN:STDIN
- get_last_line_of_STDIN:STDOUT -> path_to_galaxy_sample
- 2par_low.py:STDOUT -> Tool_STDOUT
- 2par_low.py:STDERR -> Tool_STDERR

Workflow description:
This workflow saves a tabular *.pckl Python pickle dataset to the local file system, containing values calculated from physical parameters associated with potential companions of a sample of target galaxies. The original physical parameters are extracted from a PostgreSQL database containing information on all galaxies covered by the SDSS spectroscopic survey. The workflow first accesses the external database located on the AMIGA server and selects the target galaxies of the sample (those having spectroscopic redshift between 0.03 and 0.1). It then creates a tabular gridded datacube with values associated with potential neighbours. These values are calculated for each point of a 3D space defined by the axes: magnitude in the r band, photometric redshift, and sigma level of detection. The default input values used to build the parameterised datacube are:
- 14.5 < mr < 22.5, step 0.5
- 0 < z < 0.11, step 0.01
- 0.1 < sigma < 3.2, step 0.2
Auxiliary function libraries and scripts are also copied to the local file system, and the PYTHONPATH environment variable is set to a value provided by the user as the working path of the digital experiment. The other user-provided input values are the database connection settings: hostname, login and password.

Execution environment:
The first requirement to run the workflows provided by both Research Objects is Taverna Workbench 2.4 or higher. AstroTaverna (a Taverna plugin) is also needed in order to provide the functionality related to Virtual Observatory web service queries and the management of the standard VOTable data format. In general, the execution environment is a Linux distribution including Python 2.x and a bash shell, with the psycopg2 and numpy Python packages. Access to a PostgreSQL database storing the physical parameters provided by SDSS is also needed; a dump file of the database may be downloaded from the AMIGA web server so that it can be deployed and made accessible from the local execution environment.
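Before launching the workflow it can be useful to confirm that the database prerequisite is met. The snippet below is a purely illustrative pre-flight check, not part of the workflow: it reuses the connection convention of 2par_low.py (the sdss_dr7 schema name is passed as the database name) together with the example host, port and username given for the workflow input ports; the password is a placeholder, and the actual deployment details depend on the local installation of the SDSS database dump.

# Illustrative pre-flight check for the workflow's database prerequisite.
# Connection values are the example workflow inputs; the password is a placeholder.
import psycopg2 as pg

db = pg.connect(database="sdss_dr7",
                host="amiga.iaa.csic.es",    # postgresql_server_ip
                port=5432,                   # postgresql_server_port
                user="sdss",                 # db_username
                password="CHANGE_ME")        # db_password

c = db.cursor()
# The gal table must expose the fields used by 2par_low.py.
c.execute("SELECT r, pz2, e_pz2, pz, e_pz, bsz FROM gal LIMIT 1")
print "gal table reachable, sample row:", c.fetchone()
db.close()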
Author: Juande Santander-Vela
Workflow title: Initialize Sample
Annotations dated 2012-09-26 to 2013-01-04 (UTC).