This should more properly be titled "Setting up a copy of the database on a local machine".

  1. Extract the schemas from the production server
  2. Create a new database on the local machine
  3. Create the production schemas
  4. Import the quest data into _quest_raw. This assumes you have access to the data, of course. Isaac has a script in the database, ps_load_data(), that loads the data extracts from a hard-coded location. Byron still finds it easier to use his own Scala-based script. A few transformations need to be made first:
    1. Strip the header row from each data extract, e.g.:

      #!/bin/bash

      # Load shared settings (e.g. $basedir) from the sibling "source" file.
      source "$(dirname "$0")/source"

      for src in "$basedir"/data/quest/PS_*.txt; do
          table_name=$(basename "$src" .txt)
          # Destination path is an assumption; the original left $target undefined.
          target="$basedir/data/quest/headerless/$table_name.txt"
          echo "$src" "$target"
          tail -n +2 "$src" > "$target"   # drop the header row
      done
  5. Use psql to execute "select _quest_raw.ps_correction()" to apply a limited set of corrections to the raw data.
  6. Use psql to execute "select _quest_view.update_tables()" to load the data from _quest_raw into _quest.
  7. Use psql to run create_oat.sql to create the oat schema.
  8. Use psql to run init_schema.sql to initialize the schema with table definitions, etc.
  9. Use psql to run init_data.sql to load a beginning set of data.
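The whole sequence above can be sketched as a single shell script. This is a hypothetical sketch, not the actual scripts mentioned below: the database name (quest_local), the production host name, and the dump file name are all assumptions, and the commands are echoed rather than executed so the flow can be inspected safely.

```shell
#!/bin/bash
set -euo pipefail

DBNAME=quest_local     # assumed name for the local copy
DUMP=schemas.sql       # assumed file for the schema-only dump

# Echo-only dry run; remove the 'echo' to actually execute each command.
run () {
    echo "$@"
}

# Steps 1-3: extract the production schemas, create the local database,
# and recreate the schemas locally ("production" host is an assumption).
run pg_dump --schema-only --host=production --file="$DUMP" quest
run createdb "$DBNAME"
run psql -d "$DBNAME" -f "$DUMP"

# Steps 5-6: apply corrections to the raw data, then populate _quest.
run psql -d "$DBNAME" -c "select _quest_raw.ps_correction()"
run psql -d "$DBNAME" -c "select _quest_view.update_tables()"

# Steps 7-9: create the oat schema, its table definitions, and seed data.
run psql -d "$DBNAME" -f create_oat.sql
run psql -d "$DBNAME" -f init_schema.sql
run psql -d "$DBNAME" -f init_data.sql
```

Step 4 (loading the extracts into _quest_raw) is deliberately omitted here, since it depends on which loading script you use.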
There are scripts for steps 1-3 and 7-9; steps 1-3 are combined in a single script.

-- ByronWeberBecker - 2013-06-08

Topic revision: r3 - 2013-07-11 - ByronWeberBecker