Package Gnumed :: Package pycommon :: Module gmBusinessDBObject

Source Code for Module Gnumed.pycommon.gmBusinessDBObject

  1  """GNUmed database object business class. 
  2   
  3  Overview 
  4  -------- 
  5  This class wraps a source relation (table, view) which 
  6  represents an entity that makes immediate business sense 
  7  such as a vaccination or a medical document. In many if 
  8  not most cases this source relation is a denormalizing 
  9  view. The data in that view will in most cases, however, 
 10  originate from several normalized tables. One instance 
 11  of this class represents one row of said source relation. 
 12   
 13  Note, however, that this class does not *always* simply 
 14  wrap a single table or view. It can also encompass several 
 15  relations (views, tables, sequences etc) that taken together 
 16  form an object meaningful to *business* logic. 
 17   
 18  Initialization 
 19  -------------- 
 20  There are two ways to initialize an instance with values. 
 21  One way is to pass a "primary key equivalent" object into 
 22  __init__(). Refetch_payload() will then pull the data from 
 23  the backend. Another way would be to fetch the data outside 
 24  the instance and pass it in via the <row> argument. In that 
 25  case the instance will not initially connect to the database 
 26  which may offer a great boost to performance. 
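
     A minimal sketch of both paths (cXxxXxx and pk_xxx are merely
     placeholder names, as in the child class template further below):

             # 1) by primary key - refetch_payload() will hit the backend
             obj = cXxxXxx(aPK_obj = 12)

             # 2) from a pre-fetched row - no additional database roundtrip
             rows, idx = gmPG2.run_ro_queries (
                     queries = [{'cmd': cXxxXxx._cmd_fetch_payload, 'args': [12]}],
                     get_col_idx = True
             )
             obj = cXxxXxx(row = {'data': rows[0], 'idx': idx, 'pk_field': 'pk_xxx'})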
 27   
 28  Values API 
 29  ---------- 
 30  Field values are cached for later access. They can be accessed 
 31  by a dictionary API, eg: 
 32   
 33          old_value = object['field'] 
 34          object['field'] = new_value 
 35   
 36  The field names correspond to the respective column names 
 37  in the "main" source relation. Accessing non-existent field 
 38  names will raise an error, as does trying to set fields not 
 39  listed in self.__class__._updatable_fields. To actually 
 40  store updated values in the database one must explicitly 
 41  call save_payload(). 
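
     For example (a sketch only - 'comment' stands for any field listed
     in _updatable_fields of the concrete child class):

             object['comment'] = u'seen by specialist'
             success, data = object.save_payload()
             if not success:
                     error, message = data
                     # row was modified or deleted underneath us, see below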
 42   
 43  The class will in many cases be enhanced by accessors to 
 44  related data that is not directly part of the business 
 45  object itself but closely associated with it, such as codes 
 46  linked to a clinical narrative entry (eg a diagnosis). Such 
 47  accessors in most cases start with get_*. Related setters 
 48  start with set_*. The values can be accessed via the 
 49  object['field'] syntax, too, but they will be cached 
 50  independently. 
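
     For illustration (get_codes/set_codes are hypothetical accessors
     a child class might provide):

             codes = object['codes']                 # dispatched to get_codes()
             object['codes'] = (new_codes, )         # dispatched to set_codes(new_codes)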
 51   
 52  Concurrency handling 
 53  -------------------- 
 54  GNUmed connections always run transactions in isolation level 
 55  "serializable". This prevents transactions happening at the 
 56  *very same time* from overwriting each other's data. All but one 
 57  of them will abort with a concurrency error (eg if a 
 58  transaction runs a select-for-update later than another one 
 59  it will hang until the first transaction ends. Then it will 
 60  succeed or fail depending on what the first transaction 
 61  did). This is standard transactional behaviour. 
 62   
 63  However, another transaction may have updated our row 
 64  between the time we first fetched the data and the time we 
 65  start the update transaction. This is noticed by getting the 
 66  XMIN system column for the row when initially fetching the 
 67  data and using that value as a where condition value when 
 68  updating the row later. If the row had been updated (xmin 
 69  changed) or deleted (primary key disappeared) in the 
 70  meantime the update will touch zero rows (as no row with 
 71  both PK and XMIN matching is found) even if the query itself 
 72  syntactically succeeds. 
 73   
 74  When detecting a change in a row due to XMIN being different 
 75  one needs to be careful how to represent that to the user. 
 76  The row may simply have changed but it also might have been 
 77  deleted and a completely new and unrelated row which happens 
 78  to have the same primary key might have been created ! This 
 79  row might relate to a totally different context (eg. patient, 
 80  episode, encounter). 
 81   
 82  One can offer all the data to the user: 
 83   
 84  self.original_payload 
 85  - contains the data at the last successful refetch 
 86   
 87  self.modified_payload 
 88  - contains the modified payload just before the last 
 89    failure of save_payload() - IOW the data we tried 
 90    (but failed) to store in the database 
 91   
 92  self._payload 
 93  - contains the currently active payload which may or 
 94    may not contain changes 
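
     One conceivable way for a caller to present a conflict after a
     failed save_payload() (a sketch only):

             success, data = object.save_payload()
             if not success:
                     for field in object.get_fields():
                             if object.original_payload[field] != object.modified_payload[field]:
                                     # this field carries a local edit the user may want to
                                     # re-apply after refetch_payload(ignore_changes = True)
                                     pass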
 95   
 96  For discussion on this see the thread starting at: 
 97   
 98          http://archives.postgresql.org/pgsql-general/2004-10/msg01352.php 
 99   
100  and here 
101   
102          http://groups.google.com/group/pgsql.general/browse_thread/thread/e3566ba76173d0bf/6cf3c243a86d9233 
103          (google for "XMIN semantic at peril") 
104   
105  Problem cases with XMIN: 
106   
107  1) not unlikely 
108  - a very old row is read with XMIN 
109  - vacuum comes along and sets XMIN to FrozenTransactionId 
110    - now XMIN changed but the row actually didn't ! 
111  - an update with "... where xmin = old_xmin ..." fails 
112    although there is no need to fail 
113   
114  2) quite unlikely 
115  - a row is read with XMIN 
116  - a long time passes 
117  - the original XMIN gets frozen to FrozenTransactionId 
118  - another writer comes along and changes the row 
 119  - coincidentally the row gets the exact same old XMIN *again* 
120    - now XMIN is (again) the same but the data changed ! 
121  - a later update fails to detect the concurrent change !! 
122   
123  TODO: 
124  The solution is to use our own column for optimistic locking 
125  which gets updated by an AFTER UPDATE trigger. 
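
     A conceivable implementation (an illustrative sketch only, not
     existing GNUmed schema; table, column and function names are made
     up, and a BEFORE UPDATE trigger is used here since it can bump the
     version column on the row being written without a second UPDATE):

             queries = [
                     {'cmd': u"ALTER TABLE clin.xxx ADD COLUMN row_version integer NOT NULL DEFAULT 1"},
                     {'cmd': u"CREATE FUNCTION clin.trf_bump_row_version() RETURNS trigger LANGUAGE plpgsql AS 'BEGIN NEW.row_version := OLD.row_version + 1; RETURN NEW; END;'"},
                     {'cmd': u"CREATE TRIGGER tr_bump_row_version BEFORE UPDATE ON clin.xxx FOR EACH ROW EXECUTE PROCEDURE clin.trf_bump_row_version()"}
             ]
             gmPG2.run_rw_queries(queries = queries)

     _cmds_store_payload would then match on "row_version = %(row_version)s"
     (and return the new row_version) instead of on XMIN.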
126  """ 
127  #============================================================ 
128  __author__ = "K.Hilbert <Karsten.Hilbert@gmx.net>" 
129  __license__ = "GPL v2 or later" 
130   
131   
132  import sys 
133  import types 
134  import inspect 
135  import logging 
136  import datetime 
137   
138   
139  if __name__ == '__main__': 
140          sys.path.insert(0, '../../') 
141  from Gnumed.pycommon import gmExceptions 
142  from Gnumed.pycommon import gmPG2 
143  from Gnumed.pycommon.gmDateTime import pydt_strftime 
144  from Gnumed.pycommon.gmTools import tex_escape_string, xetex_escape_string 
145   
146   
147  _log = logging.getLogger('gm.db') 
148  #============================================================ 
149  class cBusinessDBObject(object): 
150          """Represents business objects in the database. 
151   
152          Rules: 
153          - instances ARE ASSUMED TO EXIST in the database 
154          - PK construction (aPK_obj): DOES verify its existence on instantiation 
155            (fetching data fails) 
156          - Row construction (row): allowed by using a dict of pairs 
157            field name: field value (PERFORMANCE improvement) 
158          - does NOT verify FK target existence 
159          - does NOT create new entries in the database 
160          - does NOT lazy-fetch fields on access 
161   
162          Class scope SQL commands and variables: 
163   
164          <_cmd_fetch_payload> 
165          - must return exactly one row 
166          - where clause argument values are expected 
167            in self.pk_obj (taken from __init__(aPK_obj)) 
168          - must return xmin of all rows that _cmds_store_payload 
169            will be updating, so views must support the xmin columns 
170            of their underlying tables 
171   
172          <_cmds_store_payload> 
173          - one or multiple "update ... set ... where xmin_* = ... and pk* = ..." 
174            statements which actually update the database from the data in self._payload, 
175          - the last query must refetch at least the XMIN values needed to detect 
176            concurrent updates, their field names had better be the same as 
177            in _cmd_fetch_payload, 
178          - the last query CAN return other fields which is particularly 
179            useful when those other fields are computed in the backend 
180            and may thus change upon save but will not have been set by 
181            the client code explicitly - this is only really of concern 
182            if the saved subclass is to be reused after saving rather 
183            than re-instantiated 
184          - when subclasses tend to live a while after save_payload() was 
185            called and they support computed fields (say, _(some_column)) 
186            you need to return *all* columns (see cEncounter) 
187   
188          <_updatable_fields> 
189          - a list of fields available for update via object['field'] 
190   
191   
192          A template for new child classes: 
193   
194  *********** start of template *********** 
195   
196  #------------------------------------------------------------ 
197  from Gnumed.pycommon import gmBusinessDBObject 
198  from Gnumed.pycommon import gmPG2 
199   
200  #============================================================ 
201  # short description 
202  #------------------------------------------------------------ 
203  # search/replace "" " -> 3 "s 
204  # 
205  # use plural form, search-replace get_XXX 
206  _SQL_get_XXX = u"" " 
207  SELECT *, xmin AS xmin_XXX 
208  FROM XXX.v_XXX 
209  WHERE %s 
210  "" " 
211   
212  class cXxxXxx(gmBusinessDBObject.cBusinessDBObject): 
213          "" "Represents ..."" " 
214   
215          _cmd_fetch_payload = _SQL_get_XXX % u"pk_XXX = %s" 
216          _cmds_store_payload = [ 
217                  u"" " 
218                  -- typically the underlying table name 
219                  UPDATE xxx.xxx SET 
220                          -- typically "table_col = %(view_col)s" 
221                          xxx = %(xxx)s, 
222                          xxx = gm.nullify_empty_string(%(xxx)s) 
223                  WHERE 
224                          pk = %(pk_XXX)s 
225                                  AND 
226                          xmin = %(xmin_XXX)s 
227                  RETURNING 
228                          xmin as xmin_XXX 
229                          --, ... 
230                          --, ... 
231                  "" " 
232          ] 
233          # view columns that can be updated: 
234          _updatable_fields = [ 
235                  u'xxx', 
236                  u'xxx' 
237          ] 
238          #-------------------------------------------------------- 
239          # def format(self): 
240          #       return u'%s' % self 
241   
242  #------------------------------------------------------------ 
243  def get_XXX(order_by=None): 
244          if order_by is None: 
245                  order_by = u'true' 
246          else: 
247                  order_by = u'true ORDER BY %s' % order_by 
248   
249          cmd = _SQL_get_XXX % order_by 
250          rows, idx = gmPG2.run_ro_queries(queries = [{'cmd': cmd}], get_col_idx = True) 
251          return [ cXxxXxx(row = {'data': r, 'idx': idx, 'pk_field': 'pk_xxx'}) for r in rows ] 
252  #------------------------------------------------------------ 
253  def create_xxx(xxx=None, xxx=None): 
254   
255          args = { 
256                  u'xxx': xxx, 
257                  u'xxx': xxx 
258          } 
259          cmd = u"" " 
260                  INSERT INTO xxx.xxx ( 
261                          xxx, 
262                          xxx, 
263                          xxx 
264                  ) VALUES ( 
265                          %(xxx)s, 
266                          %(xxx)s, 
267                          gm.nullify_empty_string(%(xxx)s) 
268                  ) 
269                  RETURNING pk 
270                  --RETURNING * 
271          "" " 
272          rows, idx = gmPG2.run_rw_queries(queries = [{'cmd': cmd, 'args': args}], return_data = True, get_col_idx = False) 
273          #rows, idx = gmPG2.run_rw_queries(queries = [{'cmd': cmd, 'args': args}], return_data = True, get_col_idx = True) 
274   
275          return cXxxXxx(aPK_obj = rows[0]['pk']) 
276          #return cXxxXxx(row = {'data': r, 'idx': idx, 'pk_field': 'pk_XXX'}) 
277  #------------------------------------------------------------ 
278  def delete_xxx(pk_xxx=None): 
279          args = {'pk': pk_xxx} 
280          cmd = u"DELETE FROM xxx.xxx WHERE pk = %(pk)s" 
281          gmPG2.run_rw_queries(queries = [{'cmd': cmd, 'args': args}]) 
282          return True 
283  #------------------------------------------------------------ 
284   
285  *********** end of template *********** 
286   
287          """ 
288          #-------------------------------------------------------- 
289          def __init__(self, aPK_obj=None, row=None): 
290                  """Init business object. 
291   
292                  Call from child classes: 
293   
294                          super(cChildClass, self).__init__(aPK_obj = aPK_obj, row = row) 
295                  """ 
296                  # initialize those "too early" because checking descendants might 
297                  # fail which will then call __str__ in stack trace logging if --debug 
298                  # was given which in turn needs those instance variables 
299                  self.pk_obj = '<uninitialized>' 
300                  self._idx = {} 
301                  self._payload = []              # the cache for backend object values (mainly table fields) 
302                  self._ext_cache = {}            # the cache for extended method's results 
303                  self._is_modified = False 
304   
305                  # check descendants 
306                  self.__class__._cmd_fetch_payload 
307                  self.__class__._cmds_store_payload 
308                  self.__class__._updatable_fields 
309   
310                  if aPK_obj is not None: 
311                          self.__init_from_pk(aPK_obj=aPK_obj) 
312                  else: 
313                          self._init_from_row_data(row=row) 
314   
315                  self._is_modified = False 
316          #-------------------------------------------------------- 
317          def __init_from_pk(self, aPK_obj=None): 
318                  """Creates a new clinical item instance by its PK. 
319   
320                  aPK_obj can be: 
321                  - a simple value 
322                    * the primary key WHERE condition must be 
323                      a simple column 
324                  - a dictionary of values 
325                    * the primary key where condition must be a 
326                      subselect consuming the dict and producing 
327                      the single-value primary key 
328                  """ 
329                  self.pk_obj = aPK_obj 
330                  result = self.refetch_payload() 
331                  if result is True: 
332                          self.original_payload = {} 
333                          for field in self._idx.keys(): 
334                                  self.original_payload[field] = self._payload[self._idx[field]] 
335                          return True 
336   
337                  if result is False: 
338                          raise gmExceptions.ConstructorError, "[%s:%s]: error loading instance" % (self.__class__.__name__, self.pk_obj) 
339          #-------------------------------------------------------- 
340          def _init_from_row_data(self, row=None): 
341                  """Creates a new clinical item instance given its fields. 
342   
343                  row must be a dict with the fields: 
344                  - pk_field: the name of the primary key field 
345                  - idx: a dict mapping field names to position 
346                  - data: the field values in a list (as returned by 
347                    cursor.fetchone() in the DB-API) 
348   
349                          row = {'data': row, 'idx': idx, 'pk_field': 'the PK column name'} 
350   
351                          rows, idx = gmPG2.run_ro_queries(queries = [{'cmd': cmd, 'args': args}], get_col_idx = True) 
352                          objects = [ cChildClass(row = {'data': r, 'idx': idx, 'pk_field': 'the PK column name'}) for r in rows ] 
353                  """ 
354                  try: 
355                          self._idx = row['idx'] 
356                          self._payload = row['data'] 
357                          self.pk_obj = self._payload[self._idx[row['pk_field']]] 
358                  except: 
359                          _log.exception('faulty <row> argument structure: %s' % row) 
360                          raise gmExceptions.ConstructorError, "[%s:??]: error loading instance from row data" % self.__class__.__name__ 
361   
362                  if len(self._idx.keys()) != len(self._payload): 
363                          _log.critical('field index vs. payload length mismatch: %s field names vs. %s fields' % (len(self._idx.keys()), len(self._payload))) 
364                          _log.critical('faulty <row> argument structure: %s' % row) 
365                          raise gmExceptions.ConstructorError, "[%s:??]: error loading instance from row data" % self.__class__.__name__ 
366   
367                  self.original_payload = {} 
368                  for field in self._idx.keys(): 
369                          self.original_payload[field] = self._payload[self._idx[field]] 
370          #-------------------------------------------------------- 
371          def __del__(self): 
372                  if self.__dict__.has_key('_is_modified'): 
373                          if self._is_modified: 
374                                  _log.critical('[%s:%s]: losing payload changes' % (self.__class__.__name__, self.pk_obj)) 
375                                  _log.debug('original: %s' % self.original_payload) 
376                                  _log.debug('modified: %s' % self._payload) 
377          #-------------------------------------------------------- 
378          def __str__(self): 
379                  tmp = [] 
380                  try: 
381                          for attr in self._idx.keys(): 
382                                  if self._payload[self._idx[attr]] is None: 
383                                          tmp.append('%s: NULL' % attr) 
384                                  else: 
385                                          tmp.append('%s: >>%s<<' % (attr, self._payload[self._idx[attr]])) 
386                          return '[%s:%s]: %s' % (self.__class__.__name__, self.pk_obj, str(tmp)) 
387                          #return '[%s:%s]:\n %s' % (self.__class__.__name__, self.pk_obj, '\n '.join(lines)) 
388                  except: 
389                          return 'nascent [%s @ %s], cannot show payload and primary key' %(self.__class__.__name__, id(self)) 
390          #-------------------------------------------------------- 
391          def __unicode__(self): 
392                  lines = [] 
393                  try: 
394                          for attr in self._idx.keys(): 
395                                  if self._payload[self._idx[attr]] is None: 
396                                          lines.append(u'%s: NULL' % attr) 
397                                  else: 
398                                          lines.append('%s: %s' % (attr, self._payload[self._idx[attr]])) 
399                          return '[%s:%s]:\n%s' % (self.__class__.__name__, self.pk_obj, u'\n'.join(lines)) 
400                  except: 
401                          return 'nascent [%s @ %s], cannot show payload and primary key' %(self.__class__.__name__, id(self)) 
402          #-------------------------------------------------------- 
403          def __getitem__(self, attribute): 
404                  # use try: except: as it is faster and we want this as fast as possible 
405   
406                  # 1) backend payload cache 
407                  try: 
408                          return self._payload[self._idx[attribute]] 
409                  except KeyError: 
410                          pass 
411   
412                  # 2) extension method results ... 
413                  getter = getattr(self, 'get_%s' % attribute, None) 
414                  if not callable(getter): 
415                          _log.warning('[%s]: no attribute [%s]' % (self.__class__.__name__, attribute)) 
416                          _log.warning('[%s]: valid attributes: %s' % (self.__class__.__name__, str(self._idx.keys()))) 
417                          _log.warning('[%s]: no getter method [get_%s]' % (self.__class__.__name__, attribute)) 
418                          methods = filter(lambda x: x[0].startswith('get_'), inspect.getmembers(self, inspect.ismethod)) 
419                          _log.warning('[%s]: valid getter methods: %s' % (self.__class__.__name__, str(methods))) 
420                          raise KeyError('[%s]: cannot read from key [%s]' % (self.__class__.__name__, attribute)) 
421   
422                  self._ext_cache[attribute] = getter() 
423                  return self._ext_cache[attribute] 
424          #-------------------------------------------------------- 
425          def __setitem__(self, attribute, value): 
426   
427                  # 1) backend payload cache 
428                  if attribute in self.__class__._updatable_fields: 
429                          try: 
430                                  if self._payload[self._idx[attribute]] != value: 
431                                          self._payload[self._idx[attribute]] = value 
432                                          self._is_modified = True 
433                                  return 
434                          except KeyError: 
435                                  _log.warning('[%s]: cannot set attribute <%s> despite marked settable' % (self.__class__.__name__, attribute)) 
436                                  _log.warning('[%s]: supposedly settable attributes: %s' % (self.__class__.__name__, str(self.__class__._updatable_fields))) 
437                                  raise KeyError('[%s]: cannot write to key [%s]' % (self.__class__.__name__, attribute)) 
438   
439                  # 2) setters providing extensions 
440                  if hasattr(self, 'set_%s' % attribute): 
441                          setter = getattr(self, "set_%s" % attribute) 
442                          if not callable(setter): 
443                                  raise AttributeError('[%s] setter [set_%s] not callable' % (self.__class__.__name__, attribute)) 
444                          try: 
445                                  del self._ext_cache[attribute] 
446                          except KeyError: 
447                                  pass 
448                          if type(value) is types.TupleType: 
449                                  if setter(*value): 
450                                          self._is_modified = True 
451                                          return 
452                                  raise AttributeError('[%s]: setter [%s] failed for [%s]' % (self.__class__.__name__, setter, value)) 
453                          if setter(value): 
454                                  self._is_modified = True 
455                                  return 
456   
457                  # 3) don't know what to do with <attribute> 
458                  _log.error('[%s]: cannot find attribute <%s> or setter method [set_%s]' % (self.__class__.__name__, attribute, attribute)) 
459                  _log.warning('[%s]: settable attributes: %s' % (self.__class__.__name__, str(self.__class__._updatable_fields))) 
460                  methods = filter(lambda x: x[0].startswith('set_'), inspect.getmembers(self, inspect.ismethod)) 
461                  _log.warning('[%s]: valid setter methods: %s' % (self.__class__.__name__, str(methods))) 
462                  raise AttributeError('[%s]: cannot set [%s]' % (self.__class__.__name__, attribute)) 
463          #-------------------------------------------------------- 
464          # external API 
465          #-------------------------------------------------------- 
466          def same_payload(self, another_object=None): 
467                  raise NotImplementedError('comparison between [%s] and [%s] not implemented' % (self, another_object)) 
468          #-------------------------------------------------------- 
469          def is_modified(self): 
470                  return self._is_modified 
471          #-------------------------------------------------------- 
472          def get_fields(self): 
473                  try: 
474                          return self._idx.keys() 
475                  except AttributeError: 
476                          return 'nascent [%s @ %s], cannot return keys' %(self.__class__.__name__, id(self)) 
477          #-------------------------------------------------------- 
478          def get_updatable_fields(self): 
479                  return self.__class__._updatable_fields 
480          #-------------------------------------------------------- 
481          def fields_as_dict(self, date_format='%Y %b %d %H:%M', none_string=u'', escape_style=None, bool_strings=None): 
482                  if bool_strings is None: 
483                          bools = {True: u'true', False: u'false'} 
484                  else: 
485                          bools = {True: bool_strings[0], False: bool_strings[1]} 
486                  data = {} 
487                  for field in self._idx.keys(): 
488                          # FIXME: harden against BYTEA fields 
489                          #if type(self._payload[self._idx[field]]) == ... 
490                          #       data[field] = _('<%s bytes of binary data>') % len(self._payload[self._idx[field]]) 
491                          #       continue 
492                          val = self._payload[self._idx[field]] 
493                          if val is None: 
494                                  data[field] = none_string 
495                                  continue 
496                          if isinstance(val, bool): 
497                                  data[field] = bools[val] 
498                                  continue 
499   
500                          if isinstance(val, datetime.datetime): 
501                                  data[field] = pydt_strftime(val, format = date_format, encoding = 'utf8') 
502                                  if escape_style in [u'latex', u'tex']: 
503                                          data[field] = tex_escape_string(data[field]) 
504                                  elif escape_style in [u'xetex', u'xelatex']: 
505                                          data[field] = xetex_escape_string(data[field]) 
506                                  continue 
507   
508                          try: 
509                                  data[field] = unicode(val, encoding = 'utf8', errors = 'replace') 
510                          except TypeError: 
511                                  try: 
512                                          data[field] = unicode(val) 
513                                  except (UnicodeDecodeError, TypeError): 
514                                          val = '%s' % str(val) 
515                                          data[field] = val.decode('utf8', 'replace') 
516                          if escape_style in [u'latex', u'tex']: 
517                                  data[field] = tex_escape_string(data[field]) 
518                          elif escape_style in [u'xetex', u'xelatex']: 
519                                  data[field] = xetex_escape_string(data[field]) 
520   
521                  return data 
522          #-------------------------------------------------------- 
523          def get_patient(self): 
524                  _log.error('[%s:%s]: forgot to override get_patient()' % (self.__class__.__name__, self.pk_obj)) 
525                  return None 
526          #-------------------------------------------------------- 
527          def format(self): 
528                  return u'%s' % self 
529          #-------------------------------------------------------- 
530          def refetch_payload(self, ignore_changes=False): 
531                  """Fetch field values from backend. 
532                  """ 
533                  if self._is_modified: 
534                          if ignore_changes: 
535                                  _log.critical('[%s:%s]: losing payload changes' % (self.__class__.__name__, self.pk_obj)) 
536                                  _log.debug('original: %s' % self.original_payload) 
537                                  _log.debug('modified: %s' % self._payload) 
538                          else: 
539                                  _log.critical('[%s:%s]: cannot reload, payload changed' % (self.__class__.__name__, self.pk_obj)) 
540                                  return False 
541   
542                  if type(self.pk_obj) == types.DictType: 
543                          arg = self.pk_obj 
544                  else: 
545                          arg = [self.pk_obj] 
546                  rows, self._idx = gmPG2.run_ro_queries ( 
547                          queries = [{'cmd': self.__class__._cmd_fetch_payload, 'args': arg}], 
548                          get_col_idx = True 
549                  ) 
550                  if len(rows) == 0: 
551                          _log.error('[%s:%s]: no such instance' % (self.__class__.__name__, self.pk_obj)) 
552                          return False 
553                  self._payload = rows[0] 
554                  return True 
555          #-------------------------------------------------------- 
556          def __noop(self): 
557                  pass 
558          #-------------------------------------------------------- 
559          def save(self, conn=None): 
560                  return self.save_payload(conn = conn) 
561          #-------------------------------------------------------- 
562          def save_payload(self, conn=None): 
563                  """Store updated values (if any) in database. 
564   
565                  Optionally accepts a pre-existing connection 
566                  - returns a tuple (<True|False>, <data>) 
567                  - True: success 
568                  - False: an error occurred 
569                    * data is (error, message) 
570                    * for error meanings see gmPG2.run_rw_queries() 
571                  """ 
572                  if not self._is_modified: 
573                          return (True, None) 
574   
575                  args = {} 
576                  for field in self._idx.keys(): 
577                          args[field] = self._payload[self._idx[field]] 
578                  self.modified_payload = args 
579   
580                  close_conn = self.__noop 
581                  if conn is None: 
582                          conn = gmPG2.get_connection(readonly=False) 
583                          close_conn = conn.close 
584   
585                  queries = [] 
586                  for query in self.__class__._cmds_store_payload: 
587                          queries.append({'cmd': query, 'args': args}) 
588                  rows, idx = gmPG2.run_rw_queries ( 
589                          link_obj = conn, 
590                          queries = queries, 
591                          return_data = True, 
592                          get_col_idx = True 
593                  ) 
594   
595                  # this can happen if: 
596                  # - someone else updated the row so XMIN does not match anymore 
597                  # - the PK went away (rows were deleted from under us) 
598                  # - another WHERE condition of the UPDATE did not produce any rows to update 
599                  # - savepoints are used since subtransactions may relevantly change the xmin/xmax ... 
600                  if len(rows) == 0: 
601                          return (False, (u'cannot update row', _('[%s:%s]: row not updated (nothing returned), row in use ?') % (self.__class__.__name__, self.pk_obj))) 
602   
603                  # update cached values from should-be-first-and-only result 
604                  # row of last query, 
605                  # update all fields returned such that computed 
606                  # columns see their new values 
607                  row = rows[0] 
608                  for key in idx: 
609                          try: 
610                                  self._payload[self._idx[key]] = row[idx[key]] 
611                          except KeyError: 
612                                  conn.rollback() 
613                                  close_conn() 
614                                  _log.error('[%s:%s]: cannot update instance, XMIN refetch key mismatch on [%s]' % (self.__class__.__name__, self.pk_obj, key)) 
615                                  _log.error('payload keys: %s' % str(self._idx)) 
616                                  _log.error('XMIN refetch keys: %s' % str(idx)) 
617                                  _log.error(args) 
618                                  raise 
619   
620                  conn.commit() 
621                  close_conn() 
622   
623                  self._is_modified = False 
624                  # update to new "original" payload 
625                  self.original_payload = {} 
626                  for field in self._idx.keys(): 
627                          self.original_payload[field] = self._payload[self._idx[field]] 
628   
629                  return (True, None) 
630   
631  #============================================================ 
632  def jsonclasshintify(obj): 
633          # this should eventually be somewhere else 
634          """ turn the data into a list of dicts, adding "class hints". 
635          all objects get turned into dictionaries which the other end 
636          will interpret as "object", via the __jsonclass__ hint, 
637          as specified by the JSONRPC protocol standard. 
638          """ 
639          if isinstance(obj, list): 
640                  return map(jsonclasshintify, obj) 
641          elif isinstance(obj, gmPG2.dbapi.tz.FixedOffsetTimezone): 
642                  # this will get decoded as "from jsonobjproxy import {clsname}" 
643                  # at the remote (client) end. 
644                  res = {'__jsonclass__': ["jsonobjproxy.FixedOffsetTimezone"]} 
645                  res['name'] = obj._name 
646                  res['offset'] = jsonclasshintify(obj._offset) 
647                  return res 
648          elif isinstance(obj, datetime.timedelta): 
649                  # this will get decoded as "from jsonobjproxy import {clsname}" 
650                  # at the remote (client) end. 
651                  res = {'__jsonclass__': ["jsonobjproxy.TimeDelta"]} 
652                  res['days'] = obj.days 
653                  res['seconds'] = obj.seconds 
654                  res['microseconds'] = obj.microseconds 
655                  return res 
656          elif isinstance(obj, datetime.time): 
657                  # this will get decoded as "from jsonobjproxy import {clsname}" 
658                  # at the remote (client) end. 
659                  res = {'__jsonclass__': ["jsonobjproxy.Time"]} 
660                  res['hour'] = obj.hour 
661                  res['minute'] = obj.minute 
662                  res['second'] = obj.second 
663                  res['microsecond'] = obj.microsecond 
664                  res['tzinfo'] = jsonclasshintify(obj.tzinfo) 
665                  return res 
666          elif isinstance(obj, datetime.datetime): 
667                  # this will get decoded as "from jsonobjproxy import {clsname}" 
668                  # at the remote (client) end. 
669                  res = {'__jsonclass__': ["jsonobjproxy.DateTime"]} 
670                  res['year'] = obj.year 
671                  res['month'] = obj.month 
672                  res['day'] = obj.day 
673                  res['hour'] = obj.hour 
674                  res['minute'] = obj.minute 
675                  res['second'] = obj.second 
676                  res['microsecond'] = obj.microsecond 
677                  res['tzinfo'] = jsonclasshintify(obj.tzinfo) 
678                  return res 
679          elif isinstance(obj, cBusinessDBObject): 
680                  # this will get decoded as "from jsonobjproxy import {clsname}" 
681                  # at the remote (client) end. 
682                  res = {'__jsonclass__': ["jsonobjproxy.%s" % obj.__class__.__name__]} 
683                  for k in obj.get_fields(): 
684                          t = jsonclasshintify(obj[k]) 
685                          res[k] = t 
686                  print "props", res, dir(obj) 
687                  for attribute in dir(obj): 
688                          if not attribute.startswith("get_"): 
689                                  continue 
690                          k = attribute[4:] 
691                          if res.has_key(k): 
692                                  continue 
693                          getter = getattr(obj, attribute, None) 
694                          if callable(getter): 
695                                  res[k] = jsonclasshintify(getter()) 
696                  return res 
697          return obj 
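
     #------------------------------------------------------------
     # Usage sketch (illustrative only, not part of this module): the
     # hinted structure consists of plain lists/dicts and can be handed
     # to any JSON encoder on the JSON-RPC server side, eg:
     #
     #       import json
     #       wire_data = json.dumps(jsonclasshintify(obj))
     #
     # where <obj> is any cBusinessDBObject instance (or a list thereof).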
698   
699  #============================================================ 
700  if __name__ == '__main__': 
701   
702          if len(sys.argv) < 2: 
703                  sys.exit() 
704   
705          if sys.argv[1] != u'test': 
706                  sys.exit() 
707   
708          #-------------------------------------------------------- 
709          class cTestObj(cBusinessDBObject): 
710                  _cmd_fetch_payload = None 
711                  _cmds_store_payload = None 
712                  _updatable_fields = [] 
713                  #---------------------------------------------------- 
714                  def get_something(self): 
715                          pass 
716                  #---------------------------------------------------- 
717                  def set_something(self): 
718                          pass 
719          #-------------------------------------------------------- 
720          from Gnumed.pycommon import gmI18N 
721          gmI18N.activate_locale() 
722          gmI18N.install_domain() 
723   
724          data = { 
725                  'pk_field': 'bogus_pk', 
726                  'idx': {'bogus_pk': 0, 'bogus_field': 1, 'bogus_date': 2}, 
727                  'data': [-1, 'bogus_data', datetime.datetime.now()] 
728          } 
729          obj = cTestObj(row=data) 
730          #print obj['wrong_field'] 
731          #print jsonclasshintify(obj) 
732          #obj['wrong_field'] = 1 
733          print obj.fields_as_dict() 
734   
735  #============================================================ 
736   