
Offline Database Support Tasks


Generally the routine operations take little time, but expertise needs to be developed in order to react quickly to error conditions. The most time-consuming task is keeping a remote monitoring node in near-continuous operation and acting on the monitoring emails it sends.

Routine Operations

Database Management

  1. Review proposed new table .spec files
  2. Review DB-writing Scraper code
  3. Maintain SOP documentation generation and sources; the SOP is built as part of
     the OfflineUserManual (OUM), so OUM documentation generation must be
     maintained as well.

Note that the latex sources are converted into RST at every build by cnv.py, invoked from the Makefile. Incompatible latex changes have broken this conversion in the past; the easiest fix is to locate the offending latex in the docs and rewrite it within the latex subset understood by the converter.
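To illustrate the kind of line-oriented subset conversion involved (the real cnv.py differs; the function name and the supported constructs here are assumptions for illustration), a minimal sketch:

```python
import re

def latex_to_rst(line):
    """Convert a tiny LaTeX subset to RST, illustrating the style of
    line-oriented translation a converter like cnv.py performs.
    Only constructs in the known subset are translated; anything else
    passes through unchanged, which is why unexpected latex in the
    sources can break the generated RST."""
    # section headings: \section{X} -> underlined RST title
    m = re.match(r"\\section\{(.+)\}", line)
    if m:
        title = m.group(1)
        return title + "\n" + "=" * len(title)
    # inline markup: \texttt{x} -> ``x``, \emph{x} -> *x*
    line = re.sub(r"\\texttt\{([^}]*)\}", r"``\1``", line)
    line = re.sub(r"\\emph\{([^}]*)\}", r"*\1*", line)
    return line
```

Extending such a converter for a new latex construct is usually harder than rewriting the offending latex within the subset it already understands, which is why the latter is the recommended fix.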

Maintain SOP tools

The SOP is based upon several scripts, including:

No significant maintenance work is expected for these tools; however, an expert needs to be aware of their operation, able to assist in their usage, and able to fix issues that arise.

Other tools

Many other tools have been created that are more for expert usage, such as:

  • dbsrv.py provides partitioned backups in SEQNO chunks, allowing fast archive/transfer/restore operations on very large tables such as those in the Channel Quality DB. It is used from a daily cron script on dybdb2 to back up the CQDB.
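The arithmetic behind such partitioned backups can be sketched as follows (a simplified illustration, not the actual dbsrv.py implementation):

```python
def seqno_chunks(min_seqno, max_seqno, chunk_size):
    """Partition an inclusive SEQNO range into fixed-size chunks.

    Dumping each chunk to its own archive means that transfer and
    restore of a very large table can proceed incrementally, and a
    daily backup only needs to handle chunks not yet archived.
    """
    chunks = []
    lo = min_seqno
    while lo <= max_seqno:
        hi = min(lo + chunk_size - 1, max_seqno)
        chunks.append((lo, hi))
        lo = hi + 1
    return chunks
```

Each `(lo, hi)` pair would then bound a `WHERE SEQNO BETWEEN lo AND hi` dump; only the final, possibly partial, chunk changes between runs.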

SOP Policing

Verify that DB users are following the SOP and remind them when they forget, e.g.:

DBI/DybDbi Tech Support

Provide advice/help on:

  • custom DB scripts; Pedro/Gaosong for the CWG and DQWG are the usual customers
  • nosetests for checking DB updates
  • DybDbi usage techniques

Custom Scripts Advice/Help

The SOP stipulates that all writing to DBI databases should be done via DBI or DybDbi, to avoid incorrectly faking what DBI expects. Reading can be done by whatever approach is most convenient; reading with DybPython.DB (mysql-python based) and writing with DybDbi is often a good combination.

This avoids the problem of handling multiple DBI connections simultaneously.

Custom Operations

On rare occasions it is expedient to perform DB operations without following SOP approaches. For example when:

  1. jumpstarting large or expensive-to-create tables such as the DcsAdWpHv table
  2. fixing bugs in scraped tables, eg the HV time shunt problem
  3. fixing database corruption

Simple incorrect calibrations are insufficient cause to suffer the effort and risk of developing, testing and performing custom operations.

When performing custom operations, tables are often communicated via mysqldump files; tools to handle these are documented.

The approach taken to perform custom operations is:

  1. develop and test python scripts that perform the operation
  2. test these scripts on full copies of the relevant databases
  3. document the usage of the scripts
  4. train Qiumei/Liang at IHEP to first test and then perform the operations, then ask them to proceed

Even simple fixes are handled in this laborious manner.
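The fix scripts developed in step 1 typically support a dry-run mode so that step 2, testing on full copies, reports what would change before anything is written. A minimal sketch of that pattern, using sqlite3 as a stand-in for MySQL and a hypothetical time-shunt fix (table and column names are illustrative, not the actual schema):

```python
import sqlite3

def shift_times(conn, table, shift_seconds, apply=False):
    """Report, and only optionally correct, time-shunted rows.

    Dry-run by default: the UPDATE runs only when apply=True,
    mirroring the practice of exercising fix scripts on full
    copies of the database before performing the real operation.
    """
    n = conn.execute("SELECT COUNT(*) FROM %s" % table).fetchone()[0]
    if apply:
        conn.execute(
            "UPDATE %s SET TIMESTART = TIMESTART + ?" % table,
            (shift_seconds,),
        )
        conn.commit()
    return n

# exercise against a throwaway in-memory copy
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE DcsAdWpHv (SEQNO INTEGER, TIMESTART INTEGER)")
conn.executemany("INSERT INTO DcsAdWpHv VALUES (?, ?)", [(1, 100), (2, 200)])
n = shift_times(conn, "DcsAdWpHv", 3600, apply=True)
rows = conn.execute("SELECT TIMESTART FROM DcsAdWpHv ORDER BY SEQNO").fetchall()
```

Keeping the destructive step behind an explicit flag makes the same script usable for both the rehearsal on a copy and the final operation.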

Remote NuWa monitor node

Maintain remote NuWa monitor node that performs daily Database Update and Replication tests

  • The remote NuWa node (currently belle7.nuu.edu.tw) performs dybinst-based cron monitoring tasks daily. As this requires a recent NuWa installation and benefits from easy updating, it makes sense for one of the slave nodes to perform this duty.
  • offline_db and dcsdb (IHEP mirror) are checked for table updates
  • replication along the offline_db chain is checked by comparing updates along the chain
  • irregularities result in nosetest failures and the sending of notification emails
    • these request Scraper operation experts (Liang) and DCS experts (Mei) to investigate abnormalities
  • Details: DbiMonitor package : cron invoked nosetests
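The replication check above amounts to comparing per-table high-water marks along the chain. A simplified sketch of that comparison (node names and the snapshot format are illustrative, not the actual DbiMonitor implementation):

```python
def replication_lags(chain):
    """Compare table -> max SEQNO snapshots taken along a replication
    chain (master first) and report tables where any downstream node
    lags the master, in the spirit of the daily monitoring check.

    chain: list of (node_name, {table: max_seqno}) pairs.
    Returns a list of (node_name, table, lag) triples; an empty list
    means replication is fully caught up.
    """
    master_name, master = chain[0]
    problems = []
    for name, snapshot in chain[1:]:
        for table, seqno in master.items():
            lag = seqno - snapshot.get(table, 0)
            if lag > 0:
                problems.append((name, table, lag))
    return problems
```

A non-empty result is what would translate into a nosetest failure and a notification email.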

Tour of dybgaudi/Database


Exceptional Tasks

Finding solutions to exceptional problems benefits greatly from a willingness to develop MySQL expertise. For example, using GROUP BY queries with GROUP_CONCAT to construct python dict strings proved to be a game changer when dealing with the large channelquality_db tables.
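The trick is to have MySQL assemble one dict literal per group server-side, so the client parses a handful of strings instead of fetching millions of rows. A sketch of the client side (the SQL and the table/column names are illustrative, not the actual queries used):

```python
import ast

# Server-side, a query along these lines makes MySQL build one
# python dict literal per SEQNO group:
#
#   SELECT CONCAT('{', GROUP_CONCAT(CONCAT(ROW_COUNTER, ':', VOLTAGE)), '}')
#   FROM DcsAdWpHv GROUP BY SEQNO;
#
# Client-side, each returned string parses straight into a dict.
# ast.literal_eval is used rather than eval since the string comes
# from an external source.
row = "{1:1500,2:1498,3:1502}"   # one GROUP_CONCAT result string
d = ast.literal_eval(row)
```

Note that GROUP_CONCAT output is truncated at the group_concat_max_len server variable, which must be raised for large groups.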

Scraper additions

Adding new scrapers requires familiarity with the scraper framework in order to advise table experts on appropriate implementations and testing techniques.

Details: Scraping source databases into offline_db

DBI/DybDbi/NonDbi debugging

DBI and the various DB interfaces have required little maintenance. However, due to the code generation (via django templates) done by DybDbi package builds, an invalid .spec file for a new table can break the build in ways that are difficult to debug without deep debugging skills or CMT experience.

Database Interventions

Custom DB-fixing scripts, using MySQL-python and/or DybDbi as appropriate, have been required on many occasions to deal with issues encountered; for example, reacting to a Scraper bug that caused time-shifted entries.

Corruption Recovery

Recovering from channelquality_db corruption (when it was still called tmp_ligs_offline_db) required the implementation of new tools to work with large tables; see MySQL DB Repair.