CGI Build Process
This page explains the process we use for building and releasing our CGIs. This is done on a three-week schedule.
- Before week 1, the source code is said to be in the "preview1" state.
- During the first week, any changes by QA or developers are added to the tree.
- After week 1, the source code is said to be in the "preview2" state.
- During the second week, just as in week 1, any changes by QA or developers are added to the tree.
- After week 2, the final build is compiled and copied to a sandbox.
- During the third week, development continues: all changes are added and compiled on genome-test, but only bugfixes (build patches) are added to the final build sandbox (via "git cherry-pick").
- After week 3, the now-bugfixed final build from week two is copied from its sandbox to the public site.
The build after week 2 goes into sandboxes located here:
hgwdev:/usr/local/apache/cgi-bin
hgwdev:/usr/local/apache/htdocs-beta
Older hgwdev builds are periodically relocated here:
/hive/groups/browser/build
Setting Up the Environment for the Build
NOTE: The actions in this section are a one-time setup performed by the new "build-meister".
Becoming the Build-Meister
All build scripts are now run by the build user. This user should already have its environment properly configured. However, the build-meister will need to be able to log in (through ssh) as the build user, and the build user will need to know where to send mail.
- Set up .ssh/authorized_keys so that you can log in as the build user. Seek assistance from cluster-admin if you need it.
- Set the build user's BUILDMEISTER environment variable to the user name of the new build-meister:
hgwdev> ssh -X build@hgwdev    # the optional '-X' allows X-windows support
<build@hgwdev> edit .tcshrc    # use your preferred editor
# alter the following line:
< setenv BUILDMEISTER tdreszer
> setenv BUILDMEISTER chinli
Remember you will need to log out and log back in for the changes to take effect.
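A quick sanity check after logging back in (just a one-liner, not part of the official scripts):
<build@hgwdev> env | grep BUILDMEISTER   # should show the new build-meister's user name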
- Make sure the build user's cron jobs send you mail. Unlike the various build scripts, which can use the BUILDMEISTER environment variable to find you, cron runs without access to that variable. Instead, add yourself to cron's MAILTO variable:
hgwdev> ssh -X build@hgwdev          # the optional '-X' allows X-windows support
<build@hgwdev> crontab -l > cron.txt
<build@hgwdev> edit cron.txt         # use your preferred editor
< MAILTO=rhead,tdreszer,braney,cricket
> MAILTO=rhead,chinli,braney,cricket
# and
< MAILTO=tdreszer,braney
> MAILTO=chinli,braney
<build@hgwdev> crontab cron.txt
<build@hgwdev> crontab -l            # verify what you have done
Optionally set up your own build environment
NOTE: The remainder of this section is historical. Nevertheless, it is worth keeping these details here, especially if the build-meister wishes to try experimental changes under their own identity.
- Before running build scripts as yourself, you will need to set up the following in your login file:
hgwdev> cd ~           # go to your home directory as yourself
hgwdev> edit .tcshrc   # use your preferred editor
# add (or update) the following lines:
> umask 002
> source /cluster/bin/build/scripts/buildEnv.csh
# To be able to run the Java robot programs, add the following to the top of your path setting:
> set path = ( /usr/java/default/bin $path ... )
# optionally add helper aliases:
> # wb gets you to the scripts dir.
> alias wb 'cd $WEEKLYBLD'
> # cd $hg gets you to the latest build sandbox
> if ( "$HOST" == "hgwdev" ) then
>    setenv hg $BUILDDIR/v${BRANCHNN}_branch/kent/src/hg
> endif
Remember you will need to log out and log back in for the changes to take effect.
NOTE: For those more comfortable with other shells (e.g. bash), it should be possible to run the build scripts from another shell. The main limitation is the buildEnv.csh file, which is edited and checked in every week and then sourced by .tcshrc. Without changes it cannot be sourced by .bashrc.
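If you do want to experiment with bash, the following untested sketch shows one way to derive a bash-compatible copy; it assumes buildEnv.csh contains only simple "setenv VAR value" lines (the set path and other csh-specific lines are skipped and would need to be handled by hand):
# generate ~/buildEnv.sh from the csh version (rough sketch; verify the output by eye)
sed -n 's/^setenv[[:space:]]\+\([A-Za-z0-9_]\+\)[[:space:]]\+/export \1=/p' /cluster/bin/build/scripts/buildEnv.csh > ~/buildEnv.sh
source ~/buildEnv.sh   # from .bashrc instead of .tcshrc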
- Set up autologin among the general cluster machines
# On your local cse box (i.e. screech, pfft, whatever)
screech> ssh-keygen -t dsa     # use enter for all defaults
screech> cd ~/.ssh
# add yourself to the authorized keys
screech> cp id_dsa.pub authorized_keys
Also put these in your home directory on hgwdev:
screech> scp -r .ssh/ hgwdev:
# Permissions on .ssh should be 700.
# Permissions on files in .ssh/ should be 600 or 640.
- Set up autologin for hgdownload and hgdownload-sd by copying your public key to the list of authorized keys on those machines. You may need assistance from someone already authorized to log in to hgdownload and hgdownload-sd:
hgwdev> edit ~/.ssh/id_dsa.pub   # copy the public key into the clipboard and then log into hgdownload as user qateam
hgwdev> ssh qateam@hgdownload
hgdownload> cd ~/.ssh
hgdownload> edit authorized_keys # paste the key into the authorized_keys file
- You will also need a copy of .hg.conf.beta in your $HOME directory. This should be obtained from /cluster/home/build/.hg.conf.beta.
- Build symlinks. These are critical for building 64-bit utilities:
hgwdev> cd ~/bin       # Make sure you have $MACHTYPE directories
hgwdev> mkdir x86_64
# Create a symlink for each $MACHTYPE
hgwdev> ln -s /cluster/bin/x86_64 x86_64.cluster
The symtrick.csh script uses these automatically. If a script crashes and leaves the symlinks in an incorrect state, use unsymtrick.csh to restore them. The build scripts check whether unsymtrick.csh should be executed.
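If you are unsure whether a crashed script left things switched, a plain listing shows the current state of the ~/bin entries before you decide whether unsymtrick.csh is needed (a minimal check, not part of the scripts):
hgwdev> ls -ld ~/bin/x86_64 ~/bin/x86_64.cluster   # shows which entries are symlinks and where they point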
Preview1 Day Build : Day 1
This is day 1 in the schedule.
Run Git Reports
- Connect as "
build
" tohgwdev
. Then go to the weekly build dir on dev:
hgwdev> ssh -X build@hgwdev    # the optional '-X' allows X-windows support
<build@hgwdev> cd $WEEKLYBLD
- make sure you are on the master branch
<build@hgwdev> git checkout master
- edit buildEnv.csh: change the 5th line then the 4th line
<build@hgwdev> edit buildEnv.csh    # use your preferred editor
< setenv LASTREVIEWDAY 2012-06-12   # v269 preview
> setenv LASTREVIEWDAY 2012-07-03   # v270 preview
# and
< setenv REVIEWDAY 2012-07-03       # v270 preview
> setenv REVIEWDAY 2012-07-24       # v271 preview
- re-source buildEnv.csh and check that vars are correct
<build@hgwdev> source buildEnv.csh   # or just restart your shell windows
<build@hgwdev> env | egrep "VIEWDAY"
- commit the changes to this file to Git:
<build@hgwdev> git pull
<build@hgwdev> @ NEXTNN = ( $BRANCHNN + 1 ) ; git commit -m "v$NEXTNN preview1" buildEnv.csh
<build@hgwdev> git push
- run doNewReview.csh
<build@hgwdev> screen              # use screen if you wish
<build@hgwdev> ./doNewReview.csh   # review the variables
- run for real and direct output to a log file (this takes about 2 minutes - it runs git reports by ssh'ing to hgwdev)
<build@hgwdev> time ./doNewReview.csh real >& logs/v${NEXTNN}.doNewRev.log
<build@hgwdev> ctrl-a, d                              # to detach from screen
<build@hgwdev> tail -f logs/v${NEXTNN}.doNewRev.log   # see what happens
Check the reports
- The reports are automatically built by the script into this location.
Briefly review the reports as a sanity check.
Generate review pairings
(Ann takes care of this)
- Assign code-review partners in redmine.
Preview2 Day Build : Day 8
This is day 8 in the schedule.
Run Git Reports
- Connect as "
build
" tohgwdev
. Then go to the weekly build dir on dev
hgwdev> ssh -X build@hgwdev    # the optional '-X' allows X-windows support
<build@hgwdev> cd $WEEKLYBLD
- make sure you are on the master branch
<build@hgwdev> git checkout master # or just: git branch
- edit buildEnv.csh: change the 7th line then the 6th line
<build@hgwdev> edit buildEnv.csh     # use your preferred editor
< setenv LASTREVIEW2DAY 2012-06-19   # v269 preview2
> setenv LASTREVIEW2DAY 2012-07-10   # v270 preview2
# and
< setenv REVIEW2DAY 2012-07-10       # v270 preview2
> setenv REVIEW2DAY 2012-07-31       # v271 preview2
- re-source buildEnv.csh and check that vars are correct
<build@hgwdev> source buildEnv.csh   # or just restart your shell windows
<build@hgwdev> env | grep 2DAY
- commit the changes to this file to Git:
<build@hgwdev> git pull; @ NEXTNN = ( $BRANCHNN + 1 ) ; git commit -m "v$NEXTNN preview2" buildEnv.csh
<build@hgwdev> git push
- run doNewReview2.csh
<build@hgwdev> screen               # use screen if you wish
<build@hgwdev> ./doNewReview2.csh   # review the variables
- run for real and direct output to a log file (this takes about 2 minutes - it runs git reports by ssh'ing to hgwdev)
<build@hgwdev> ./doNewReview2.csh real >& logs/v$BRANCHNN.doNewRev2.log
<build@hgwdev> ctrl-a, d                               # to detach from screen
<build@hgwdev> tail -f logs/v$BRANCHNN.doNewRev2.log   # see what happens
Check the reports
- The reports are automatically built by the script into this location.
Briefly review the reports as a sanity check.
Generate review pairings
(Ann takes care of this)
- Assign code-review partners in redmine.
Final Build : Day 15
This is day 15 in the schedule.
Do the Build
- Connect as "
build
" tohgwdev
. Then go to the weekly build dir on dev
hgwdev> ssh -X build@hgwdev    # the optional '-X' allows X-windows support
<build@hgwdev> cd $WEEKLYBLD
- make sure you are on the master branch and there are no uncommitted script changes
<build@hgwdev> git checkout master
<build@hgwdev> git status
- edit the buildEnv.csh file
<build@hgwdev> edit buildEnv.csh   # use your preferred editor
< setenv LASTWEEK 2012-06-26       # v269 final
> setenv LASTWEEK 2012-07-17       # v270 final
# and
< setenv TODAY 2012-07-17          # v270 final
> setenv TODAY 2012-08-07          # v271 final
# and the big one:
< setenv BRANCHNN 270
> setenv BRANCHNN 271
- re-source buildEnv.csh and check that vars are correct
<build@hgwdev> source buildEnv.csh   # or just restart your shell windows
<build@hgwdev> env | egrep "DAY|NN|WEEK"
- commit the changes to this file to Git:
<build@hgwdev> git pull; git commit -m "v$BRANCHNN final build" buildEnv.csh
<build@hgwdev> git push
- run doNewBranch.csh
<build@hgwdev> screen              # NOTE: screen is recommended this time!
<build@hgwdev> ./doNewBranch.csh   # review the variables
- run for real, send the output to a file and review while it is written (takes ~1 hour)
<build@hgwdev> ./doNewBranch.csh real >& logs/v${BRANCHNN}.doNewBranch.log
<build@hgwdev> ctrl-a, d                                  # to detach from the screen
<build@hgwdev> tail -f logs/v${BRANCHNN}.doNewBranch.log  # follow what happens
- look for files that tell you it was successful (script will report whether these files were created):
<build@hgwdev> ls -l /cluster/bin/build/scripts/GitReports.ok
- Check timestamp of CGIs in hgwdev:/usr/local/apache/cgi-bin and the version number in the browser title header.
- If you get errors, it might be because the error check in the script is too broad rather than because there is an actual error. For example, the 'make' log file is grepped for 'error|warn', so any new C file with 'error' or 'warn' in its name will show up as an error whether or not it compiled cleanly. You might need to change the script to remove references to files like this, e.g. edit buildBeta.csh to ignore references to files like gbWarn.c and gbWarn.o in the log:
<build@hgwdev> edit buildBeta.csh
...
make beta >& make.beta.log
# These flags and programs will trip the error detection
sed -i -e "s/-DJK_WARN//g" make.beta.log
sed -i -e "s/-Werror//g" make.beta.log
#-- report any compiler warnings, fix any errors (shouldn't be any)
#-- to check for errors:
set res = `/bin/egrep -i "error|warn" make.beta.log | /bin/grep -v "gbWarn.o -c gbWarn.c" | /bin/grep -v "gbExtFile.o gbWarn.o gbMiscDiff.o"`
- What the doNewBranch.csh script does:
- edits the versionInfo.h file
- makes tags (takes 1 minute)
- builds Git reports (takes 1 minute)
- does the build (takes 5-10 minutes)
- builds utils (of secondary importance)
- builds CGIs (most important)
Run the Robots
- [build-meister] run doRobots.csh, and watch the log if you are interested (most log messages go to the logs/ dir mentioned below)
<build@hgwdev> screen      # startup a new screen
<build@hgwdev> ./doRobots.csh >& logs/v$BRANCHNN.robots.log
<build@hgwdev> ctrl-a, d   # detach from screen
<build@hgwdev> tail -f logs/v$BRANCHNN.robots.log
- What the doRobots.csh script does:
- runs robots one at a time
- hgNear (20 min)
- hgTables (several hours)
- TrackCheck (several hours)
- LiftOverTest (quick)
- [push shepherd] Review the error logs for the robots:
error logs located here: hgwdev:/cluster/bin/build/scripts/logs
- hgNear -- sends email with results
- hgTables -- sends email with results
- TrackCheck -- must check by hand: grep -i "error" logs/TrackCheck-v$BRANCHNN.log (TrackCheck person does this)
- LiftOverTest -- must check by hand: cat logs/LiftOverTest-v$BRANCHNN.log
NOTE: These robot tests take more than 6 hours. Do not wait for them; do the rest of the steps, such as GBiB, in the meantime.
Genome Browser in a Box
- The build account on hgwdev operates this procedure.
- Verify that the previous build was correctly archived; there should be a copy of /usr/local/apache/htdocs/gbib/gbibBeta.zip at /hive/groups/browser/vBox/gbibV{previousN}.zip:
ls -ogrt /usr/local/apache/htdocs/gbib/gbibBeta.zip /hive/groups/browser/vBox/gbibV*.zip
If that doesn't exist, copy it over, adjust {previousN}:
cp -p /usr/local/apache/htdocs/gbib/gbibBeta.zip /hive/groups/browser/vBox/gbibV{previousN}.zip
- Start the GBiB VM "browserbox":
<build@hgwdev> boxStart
If the VM is already running, this will fail with "The machine 'browserbox' is already locked for a session". Just ignore the message.
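Optionally, check first whether the box is already up, using the same VBoxManage listing shown later for the beta box (this assumes the build account can run VBoxManage for this VM):
<build@hgwdev> VBoxManage list runningvms | grep -i browserbox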
- Run the update process on GBiB so that it grabs 1) the newly compiled CGIs from cgi-bin-beta/, 2) htdocs from htdocs-beta/, and 3) some additional files. BEWARE: when the box starts up, it will begin updating itself from hgdownload, since it has been a couple of weeks since it last ran. When you log in, run ps -ef | grep rsync to verify that the existing rsync has completed. You do not need to interrupt it; just wait for it to finish (a simple wait loop is sketched after this step), then run the updateBrowser command:
<build@hgwdev> ssh box
browser@browserbox:~$ updateBrowser hgwdev hiram beta
You will need to type your hgwdev password three times.
Type exit when done to return to hgwdev.
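As mentioned above, you must let the box's own startup rsync finish first. A hand-rolled wait loop like the following can do the polling for you (a sketch only, assuming the box's login shell is bash):
browser@browserbox:~$ while ps -ef | grep -v grep | grep -q rsync; do sleep 60; done
browser@browserbox:~$ updateBrowser hgwdev hiram beta   # safe to run once the loop exits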
- Package up the new version of gbib.zip and the "push" directory
<build@hgwdev> time ./boxRelease.csh beta >& logs/v${BRANCHNN}.boxRelease.log   # about 20 minutes
- Archive this copy of gBiB:
cp -p /usr/local/apache/htdocs/gbib/gbibBeta.zip /hive/groups/browser/vBox/gbibV${BRANCHNN}.zip
- NOTE: These three commands have been automated as a single script, "buildGbibAndZip beta", that can be run as the build user with no need to type passwords; the script contains some strange public-key trickery to make this work. However, this combined script has never been tested by the build-meister, who runs the individual commands above instead.
Restart qateam beta gBiB
- Log in to the qateam account on hgwdev and review which VMs are running:
VBoxManage list runningvms   # lists running VMs
VBoxManage list vms          # lists all registered VMs
- Stop the currently running beta box:
boxBetaStop # this runs the command: VBoxManage controlvm browserboxbeta acpipowerbutton
- de-register this existing beta VM:
VBoxManage unregistervm browserboxbeta
- archive this existing beta VM; examine the directory /cluster/home/qateam/VirtualBox VMs/ to see the naming pattern:
cd VirtualBox\ VMs
mv browserboxbeta browserboxbeta.vNNN
- unpack the new gBiB version
mkdir browserboxbeta
cd browserboxbeta
unzip /usr/local/apache/htdocs/gbib/gbibBeta.zip
- prepare this image to register as a new browserboxbeta VM:
sed -e 's/browserbox/browserboxbeta/; s/1234/1236/; s/1235/1237/;' browserbox.vbox > browserboxbeta.vbox
- register this new browserboxbeta image:
cd $HOME
VBoxManage registervm `pwd`/"VirtualBox VMs/browserboxbeta/browserboxbeta.vbox"
- restart the box with 'nice -n +19':
boxBetaStart # this runs the command: nice -n +19 VBoxHeadless -s browserboxbeta &
- test the web server and the login account:
wget http://localhost:1236/index.html -O /dev/stdout
ssh boxBeta   # use password 'browser' to login - this uses .ssh/config to get into the correct port
Check the reports
- The reports are automatically built by the script into this location.
- Briefly review the reports as a sanity check.
Generate the code summaries and review pairings
(Ann takes care of this)
- Assign code-review partners in redmine.
- Summarize the code changes that were committed during the past week. Solicit input from the engineers.
- Update these pages with the summary:
- Send an email to browser-staff with links to the summaries.
Test on hgwdev
- Wait to hear from QA about how their CGIs look on hgwbeta. QA members should update the CGI build chatter ticket in Redmine with a "done testing" message or, if applicable, a "not following issues for this release" message. Each member of the QA team has testing responsibilities.
Make changes to code base as necessary
This happens on days 15, 16, 17, and 18 in the schedule.
- If there are problems with the build, a developer will fix the code. This fix needs to be patched into the build on hgwdev. This page explains how to do a Cherry Pick on hgwdev.
Fixing problems in the Build
This usually happens between days 16 and 19.
QA advises buildmeister to cherry pick
- see these instructions.
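The linked page has the authoritative steps; very roughly, a cherry-pick into the current build sandbox looks like the sketch below (the commit hash is hypothetical, and the CGIs still need to be rebuilt afterwards as described in the linked instructions):
<build@hgwdev> cd $BUILDDIR/v${BRANCHNN}_branch/kent
<build@hgwdev> git cherry-pick abc1234    # hypothetical hash of the fix commit from master
<build@hgwdev> git push origin v${BRANCHNN}_branch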
Push the CGIs
This is day 22 in the schedule.
The day before the push (day 21 in the schedule) send email notice
Send email to all of browser-staff (which includes cluster-admin) letting them know that tomorrow is a push day. Something along these lines:
Just a heads up that tomorrow is a CGI push day. If you have big code changes included in this release please be available in case something goes wrong with the push of your changes. QA typically starts the push around 1:30pm.
Push to hgw0 only
- hgw0 is identical to the RR machines but not actually in the RR (i.e. changes there are not seen by the public).
- QA will send an email to push-request the morning of the push letting the pushers know that today is a CGI push day (this is their notice to be vigilant about pushing quickly).
- QA will ask for push of CGIs from hgwdev to hgw0 only. If there is a NEW CGI or file(s) going out this week, be sure to make a prominent note of it in your push request. The admins push from a script, and they will need to add your new CGI to the script. (the build-meister should not be cc'd on this email.)
As of January 2015, here's a list of the CGIs and data files we push. Note: CGIs and data files may have been added since this list was created -- this is meant to be a starting point.
cartDump cartReset das hgApi hgBeacon hgBlat hgc hgConvert hgCustom hgEncodeApi hgEncodeDataVersions hgEncodeVocab hgFileSearch hgFileUi hgGateway hgGene hgGenome hgHubConnect hgLiftOver hgLogin hgMirror hgNear hgPal hgPcr hgRenderTracks hgSession hgSuggest hgTables hgTracks hgTrackUi hgUserSuggestion hgVai hgVisiGene phyloGif
and these configuration files:
/usr/local/apache/cgi-bin/all.joiner
/usr/local/apache/cgi-bin/greatData/*
/usr/local/apache/cgi-bin/hgCgiData/*
/usr/local/apache/cgi-bin/hgGeneData/*
/usr/local/apache/cgi-bin/hgNearData/*
/usr/local/apache/cgi-bin/hgcData/*
/usr/local/apache/cgi-bin/loader/*
/usr/local/apache/cgi-bin/lsSnpPdbChimera.py
/usr/local/apache/cgi-bin/utils/*
/usr/local/apache/cgi-bin/visiGeneData/*
For these directories we request an rsync --delete (from hgwbeta to the RR):
/usr/local/apache/htdocs/js/*
/usr/local/apache/htdocs/style/*
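For illustration only: the admins run their own push scripts, but the request above corresponds roughly to commands of this form, run on hgwbeta once per RR machine (hgwN is a placeholder):
rsync -av --delete /usr/local/apache/htdocs/js/    hgwN:/usr/local/apache/htdocs/js/
rsync -av --delete /usr/local/apache/htdocs/style/ hgwN:/usr/local/apache/htdocs/style/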
- Run TrackCheck on hgw0. This is the responsibility of the QA person who tests hgTracks.
- make a props file which specifies the machine/db to check. Set zoomCount=1 and it will only check the default position for each assembly. Example props file for hgw0:
machine mysqlbeta.soe.ucsc.edu   # This is where TrackCheck checks for active databases (active=1). These databases may not be on the RR, and it will give errors which can be ignored.
server hgw0.soe.ucsc.edu         # this is the machine that you are testing
quick false
dbSpec all                       # You can list just one database here
table all                        # You can list one table if need be
zoomCount 1                      # if the number is greater than one it will check links at higher zoom levels
- run it from hgwdev:
nohup TrackCheck hgw0.props >& $WEEKLYBLD/logs/TrackCheck-hgw0.07-13-2006
- run in the background if desired by typing Ctrl-Z then "bg"; to check status, type "jobs" or "ps -ef | grep TrackCheck".
- examine the file for errors.
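A quick way to do that scan, mirroring the TrackCheck error check used for the final build (log file name taken from the example above):
hgwdev> grep -i error $WEEKLYBLD/logs/TrackCheck-hgw0.07-13-2006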
- Monitor the Apache error log (QA does this); see examples here:
hgw0:/usr/local/apache/logs/error_log
To watch the log without line wraps, type "less -S error_log". Typing capital "F" will allow you to follow incoming errors. When errors arise, you can type Ctrl-C and use the right arrow to scroll the window over to see the entire message. Update: To view the error log without Hiram's CGI_TIME entries (for background info see http://redmine.soe.ucsc.edu/issues/10081):
$ tail -f error_log | grep -v CGI_TIME
- Wait to hear from QA about how their CGIs look on hgw0. Each member of the QA team has testing responsibilities. Check also that TrackCheck ran successfully.
Push to hgwN only
- hgwN is one of the RR machines, hgw1-6. Each build, rotate to the next machine in numeric order (i.e. hgw1, then hgw2, etc.) so that one machine is not worked more than the others.
- Once the new CGIs are on hgwN the push shepherd will watch the error logs for a short while to make sure no new errors occur under load.
Push to the rest of the RR and hgwbeta-public and euronode
- QA will ask for a push from hgwbeta to the rest of the hgwN machines, as well as hgwbeta-public and the euronode. The js and style directory files should also go to /usr/local/apache/htdocs/js-public/* (or style-public/*) on hgwbeta ONLY, in order to keep the javascript the same on the RR and hgwbeta-public. So, in addition to asking for the rsync --delete of the directories from hgwbeta to the RR machines, we also need to ask for an rsync --delete:
from  /usr/local/apache/htdocs/js/*
      /usr/local/apache/htdocs/style/*        (on hgwbeta)
to    /usr/local/apache/htdocs/js-public/*
      /usr/local/apache/htdocs/style-public/* (on hgwbeta ONLY)
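Again for illustration only (the admins do the actual copy): the hgwbeta-only part of the request amounts to a local rsync on hgwbeta of roughly this form:
# run on hgwbeta itself
rsync -av --delete /usr/local/apache/htdocs/js/    /usr/local/apache/htdocs/js-public/
rsync -av --delete /usr/local/apache/htdocs/style/ /usr/local/apache/htdocs/style-public/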
- QA will send email to the build-meister to let him/her know that the CGIs are on the RR.
Remember to keep track of new features
Anyone can add to this list at any time, but if no notes for this release have been made on the new features page, now is a good time to add some.
Final Build Wrap-up
This is day 23 in the schedule.
The build-meister should do these steps once QA has notified them that all RR machines have been updated.
Normally this is run once at the end of the cycle. However, occasionally it is necessary to patch a build after it is already released on the RR. Depending upon the extent of the patch, it may be desirable or even necessary to rerun the wrap-up. All of these scripts can be safely rerun until the next build is made (i.e. until BRANCHNN is updated in buildEnv.csh).
- Connect as "
build
" onhgwdev
this time. Then go to the weekly build dir on dev
hgwdev> ssh -X build@hgwdev    # the optional '-X' allows X-windows support
<build@hgwdev> cd $WEEKLYBLD
- build and push hgcentral IF there are any changes
<build@hgwdev> ./buildHgCentralSql.csh
<build@hgwdev> ./buildHgCentralSql.csh real >& logs/v${BRANCHNN}.buildHgCentralSql.log
- check that the hgcentral.sql has been updated: http://hgdownload.soe.ucsc.edu/admin/ and http://hgdownload-sd.soe.ucsc.edu/admin/
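A quick command-line spot check, if you prefer it to a browser (this assumes the dump is published as admin/hgcentral.sql on each server):
curl -sI http://hgdownload.soe.ucsc.edu/admin/hgcentral.sql | grep -i Last-Modified
curl -sI http://hgdownload-sd.soe.ucsc.edu/admin/hgcentral.sql | grep -i Last-Modified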
- Now connect to hgwdev.
hgwdev> ssh -X build@hgwdev    # the optional '-X' allows X-windows support
<build@hgwdev> cd $WEEKLYBLD
<build@hgwdev> screen          # if desired
- build 'userApps' target (various utilities) on hgwdev and scp them to hgdownload and hgdownload-sd
<build@hgwdev> time ./doHgDownloadUtils.csh >& logs/v${BRANCHNN}.doHgDownloadUtils.log
- check that the utils version is right: http://hgdownload.soe.ucsc.edu/admin/exe and http://hgdownload-sd.soe.ucsc.edu/admin/exe
- update the beta tag to match the release:
<build@hgwdev> cd $WEEKLYBLD
<build@hgwdev> env             # just to make sure it looks right
<build@hgwdev> ./tagBeta.csh
<build@hgwdev> ./tagBeta.csh real >& logs/v${BRANCHNN}.tagBeta.log
- tag the official release
<build@hgwdev> cd $WEEKLYBLD
<build@hgwdev> git fetch
<build@hgwdev> git tag | grep "v${BRANCHNN}_branch"
# Note: use .1 or .2 or whatever is the next unused subversion number
<build@hgwdev> git push origin origin/v${BRANCHNN}_branch:refs/tags/v${BRANCHNN}_branch.1
<build@hgwdev> git fetch
- zip the source code
<build@hgwdev> cd $WEEKLYBLD
<build@hgwdev> time ./doZip.csh >& logs/v${BRANCHNN}.doZip.log   # (this is automatically pushed to hgdownload)
- check that the source file version is right: http://hgdownload.soe.ucsc.edu/admin/ and http://hgdownload-sd.soe.ucsc.edu/admin/
- WAIT 10 minutes, then run ./userApps.sh, which packages up the userApps/ directory with its source and pushes it to hgdownload and hgdownload-sd htdocs/admin/exe/:
time ./userApps.sh >& logs/v${BRANCHNN}.userApps.log
- request push to hgdownload and the genome browser store from hgwdev
The gbib.zip archive goes from hgwdev
/usr/local/apache/htdocs/gbib/gbibBeta.zip
To the following directory on genome-store
/var/www/browserShop/media/products/gbib.zip
And the incremental push updates go from hgwdev
/usr/local/apache/htdocs/gbib/push/
To hgdownload in the directory /mirrordata/gbib/
/mirrordata/gbib/push/
- WAIT a day for the nightly rsync of the cgi-bin/ and htdocs/ hierarchies from the RR to hgdownload
- confirm cgi-bin/ and htdocs/ on hgdownload are up to date, then send email to genome-mirror@soe.ucsc.edu.
Include this link to the latest source: http://hgdownload.soe.ucsc.edu/admin/jksrc.zip. Use the last email as a template (see https://www.soe.ucsc.edu/pipermail/genome-mirror). If you pushed hgcentral.sql, make sure to mention in the email that it has also changed.
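One way to spot-check from the command line that hgdownload's copy is current before mailing the list, using the same public rsync module advertised in the mirror email (a sanity check only, not an official step):
rsync --list-only rsync://hgdownload.cse.ucsc.edu/cgi-bin/hgTracks   # the timestamp shown should match this release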
Example:
To: genome-mirror@soe.ucsc.edu
Subject: v292 Genome Browser Available

Good Afternoon Genome Browser Mirror Site Operators:

The version v292 source is now available at:
  http://hgdownload.soe.ucsc.edu/admin/jksrc.zip
or labelled with source number:
  http://hgdownload.soe.ucsc.edu/admin/jksrc.v292.zip

The version v292 CGI binaries can be found at:
  rsync -avP rsync://hgdownload.cse.ucsc.edu/cgi-bin/ ${WEBROOT}/cgi-bin/
or:
  ftp://hgdownload.cse.ucsc.edu/apache/cgi-bin/

A license is required for commercial download and/or installation of the
Genome Browser binaries and source code. No license is needed for academic,
nonprofit, and personal use.

Summaries of changes can be found here:
  http://genecats.soe.ucsc.edu/builds/versions.html

The following CGIs were updated:
cartDump cartReset das hgApi hgBlat hgConvert hgCustom hgEncodeApi
hgEncodeDataVersions hgEncodeVocab hgFileSearch hgFileUi hgGateway hgGene
hgGenome hgHubConnect hgLiftOver hgLogin hgNear hgPal hgPcr hgRenderTracks
hgSession hgSuggest hgTables hgTrackUi hgTracks hgUserSuggestion hgVai
hgVisiGene hgc phyloGif

and these configuration files:
/usr/local/apache/cgi-bin/all.joiner
/usr/local/apache/cgi-bin/encode/cv.ra
/usr/local/apache/cgi-bin/greatData/*
/usr/local/apache/cgi-bin/hgCgiData/*
/usr/local/apache/cgi-bin/hgGeneData/*
/usr/local/apache/cgi-bin/hgNearData/*
/usr/local/apache/cgi-bin/hgcData/*
/usr/local/apache/cgi-bin/loader/*
/usr/local/apache/cgi-bin/lsSnpPdbChimera.py
/usr/local/apache/cgi-bin/visiGeneData/*

Please rsync --delete these directories:
/usr/local/apache/htdocs/js/*
/usr/local/apache/htdocs/style/*

Please rsync this directory:
/usr/local/apache/htdocs/images/*

The script in the source tree: src/product/scripts/updateHtml.sh
can be used to update your htdocs directory.

A new hgcentral.sql file is now present at:
  http://hgdownload.cse.ucsc.edu/admin/

If you have any questions or concerns, please feel free to write back to
this mail list.

Thanks,
{buildmeister}