CGI Build Process
This page explains the process we use for building and releasing our CGIs. This is done on a three-week [[CGI_Build_Schedule | schedule]].
* Before week 1, the source code is said to be in the "preview1" state.
* During the first week, any changes by QA or developers are added to the tree.
* After week 1, the source code is said to be in the "preview2" state.
* During the second week, just as in week 1, any changes by QA or developers are added to the tree.
* After week 2, the final build is compiled and copied to a sandbox.
* During the third week, development continues: all changes are added and compiled on genome-test, but only bugfixes (build patches) are added to the final build sandbox ("git cherry-pick").
* After week 3, the now-bugfixed final build from week two is copied from its sandbox to the public site.
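The week-3 patch flow above amounts to cherry-picking individual fixes from master onto the release branch. A minimal sketch of the idea, using invented branch, file, and commit names (the real branch names follow the v${BRANCHNN}_branch pattern):

```shell
# Hypothetical sketch of the week-3 patch flow: development continues on
# master, but only the bugfix commit is applied to the release branch.
set -e
cd "$(mktemp -d)"
git init -q repo && cd repo
git config user.email "build@example.com" && git config user.name "build"
git commit -q --allow-empty -m "v270 final build point"
git branch v270_branch                   # the "final build" sandbox branch
echo "fix"     > hgTracks.c              # a bugfix, committed to master as usual
git add hgTracks.c && git commit -q -m "bugfix for hgTracks"
fix=$(git rev-parse HEAD)
echo "feature" > newFeature.c            # ordinary development, stays on master
git add newFeature.c && git commit -q -m "new feature"
git checkout -q v270_branch
git cherry-pick -x "$fix"                # only the bugfix reaches the release branch
```

The `-x` flag records the original commit hash in the cherry-picked commit message, which helps trace a build patch back to its master commit.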
The build after week 2 is built into sandboxes located here:
 hgwdev:/usr/local/apache/cgi-bin
 hgwdev:/usr/local/apache/htdocs-beta
Older hgwdev builds are periodically relocated here:
 /hive/groups/browser/build
* Set up .ssh/authorized_keys so that you can log in as the <code>build</code> user. Seek assistance from cluster-admin if you need it.
* Assign the <code>build</code> user's BUILDMEISTEREMAIL environment variable to the user name of the new build meister:
 hgwdev> ssh -X build@hgwdev         # the optional '-X' allows X-windows support
 <build@hgwdev> edit .tcshrc         # use your preferred editor
 # alter the following line:
 < setenv BUILDMEISTEREMAIL tdreszer
 > setenv BUILDMEISTEREMAIL chinli
Remember you will need to log out and log back in for the changes to take effect.
* As the new build-meister, git commits from the build user should be associated with your email address. Change the git configuration to use it:
 hgwdev> ssh -X build@hgwdev         # the optional '-X' allows X-windows support
 <build@hgwdev> edit .gitconfig      # use your preferred editor
 # alter the following line:
 < email = oldbuildmeister@soe.ucsc.edu
 > email = youremail@soe.ucsc.edu
* Make sure the <code>build</code> user's cron jobs send you mail. Unlike the various build scripts, which can use the <code>BUILD_MEISTER</code> environment variable to find you, cron runs without access to that variable. Instead, you should add yourself to cron's <code>MAILTO</code> variable:
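For example, the top of the build user's crontab might look like the fragment below. The address and the example job line are placeholders, not the real entries; edit the real crontab with <code>crontab -e</code> as the build user:

```shell
# In the build user's crontab (edit with: crontab -e).
# cron mails each job's stdout/stderr to every address listed in MAILTO.
MAILTO=newbuildmeister@soe.ucsc.edu
# Example job line (placeholder): run a nightly script at 05:30 every day.
# 30 5 * * * /cluster/home/build/someNightlyScript.csh
```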
 > alias wb 'cd $WEEKLYBLD'
 > # cd $hg gets you to the latest build sandbox
 > if ( "$HOST" == "hgwdev" ) then
 > setenv hg $BUILDDIR/v${BRANCHNN}_branch/kent/src/hg
 > endif
* Set up autologin among the general cluster machines
 # On your local soe box (i.e. screech, pfft, whatever)
 screech> ssh-keygen -t dsa          # use enter for all defaults
 screech> cd ~/.ssh
 # Permissions on files in .ssh/ should be 600 or 640.
* Set up autologin for hgdownload by copying your public key to the list of authorized keys on those machines. You may need assistance from someone already authorized to log in to hgdownload:
 hgwdev> edit ~/.ssh/id_dsa.pub      # copy the public key into the clipboard and then log into hgdownload as user qateam
 hgwdev> ssh qateam@hgdownload
* You will also need a copy of .hg.conf.beta in your $HOME directory. This should be obtained from /cluster/home/build/.hg.conf.beta.
* Build symlinks. These are critical for building 64-bit utilities:
<pre>
hgwdev> cd ~/bin
# Make sure you have $MACHTYPE directories
hgwdev> mkdir x86_64
# Create a symlink for each $MACHTYPE
hgwdev> ln -s /cluster/bin/x86_64 x86_64.cluster
</pre>
The <code>symtrick.csh</code> script uses these automatically. If a script crashes and leaves the symlinks in an incorrect state, use <code>unsymtrick.csh</code> to restore them. Build scripts check to see if <code>unsymtrick.csh</code> should be executed.
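The idea behind the symlink trick can be sketched with plain shell. Directory names below are local stand-ins, and <code>symtrick.csh</code> itself remains the authoritative implementation; this only illustrates the swap-and-restore pattern:

```shell
# Stand-in sketch of the symtrick/unsymtrick idea: temporarily repoint the
# $MACHTYPE bin directory at the cluster-wide binaries, then restore it.
set -e
cd "$(mktemp -d)"
mkdir x86_64 cluster-bin            # stand-ins for ~/bin/x86_64 and /cluster/bin/x86_64
ln -s cluster-bin x86_64.cluster    # as created in the setup step above
# "symtrick": move the real directory aside and point the name at the cluster copy
mv x86_64 x86_64.orig
ln -s cluster-bin x86_64
# "unsymtrick": undo the swap, restoring the original directory
rm x86_64
mv x86_64.orig x86_64
```

If a build script dies between the two halves, the directory is left in the "tricked" state, which is exactly the condition <code>unsymtrick.csh</code> exists to repair.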
<BR><BR>
== Preview1 Day Build : Day 1 ==
===Run Git Reports===
* Connect as "<code>build</code>" to <code>hgwdev</code>, then go to the weekly build dir on dev:
 hgwdev> ssh -X build@hgwdev         # the optional '-X' allows X-windows support
 <build@hgwdev> cd $WEEKLYBLD
* Make sure you are on the master branch:
 <build@hgwdev> screen               # use screen if you wish
 <build@hgwdev> git checkout master
 <build@hgwdev> git pull
* Edit buildEnv.csh: change the 5th line, then the 4th line:
 <build@hgwdev> vi buildEnv.csh      # use your preferred editor
 < setenv LASTREVIEWDAY 2012-06-12   # v269 preview
 > setenv LASTREVIEWDAY 2012-07-03   # v270 preview
* Re-source buildEnv.csh and check that the vars are correct:
 <build@hgwdev> source buildEnv.csh  # or just restart your shell windows
 <build@hgwdev> env | egrep "VIEWDAY"
* Commit the changes to this file to Git. For TICKETNUM, use the chatter ticket of the previous build:
 <build@hgwdev> @ NEXTNN = ( $BRANCHNN + 1 )
 <build@hgwdev> git commit -m "v$NEXTNN preview1, refs #TICKETNUM" buildEnv.csh
 <build@hgwdev> git push
* Run doNewReview.csh:
 <build@hgwdev> ./doNewReview.csh    # review the variables
* Run for real and direct the output to a log file (this takes about 2 minutes; it runs git reports by ssh'ing to hgwdev):
 <build@hgwdev> time ./doNewReview.csh real >& logs/v${NEXTNN}.doNewRev.log
 <build@hgwdev> ctrl-a, d            # to detach from screen
 <build@hgwdev> source buildEnv.csh  # just to be sure
 <build@hgwdev> @ NEXTNN = ( $BRANCHNN + 1 )
 <build@hgwdev> tail -f logs/v${NEXTNN}.doNewRev.log  # see what happens
===Check the reports===
* The reports are automatically built by the script into this [https://genecats.gi.ucsc.edu/git-reports location].
Press Ctrl-R to refresh your browser.
Briefly review the reports as a sanity check.
===Generate review pairings===
(Clay takes care of this)
* Assign code-review partners in redmine.
<BR><BR>
===Run Git Reports===
* Connect as "<code>build</code>" to <code>hgwdev</code>, then go to the weekly build dir on dev:
 hgwdev> ssh -X build@hgwdev         # the optional '-X' allows X-windows support
 <build@hgwdev> cd $WEEKLYBLD
* Make sure you are on the master branch:
 <build@hgwdev> screen               # use screen if you wish
 <build@hgwdev> git checkout master  # or just: git branch
 <build@hgwdev> git pull
* Edit buildEnv.csh: change the 7th line, then the 6th line:
 <build@hgwdev> vi buildEnv.csh      # use your preferred editor
 < setenv LASTREVIEW2DAY 2012-06-19  # v269 preview2
 > setenv LASTREVIEW2DAY 2012-07-10  # v270 preview2
* Re-source buildEnv.csh and check that the vars are correct:
 <build@hgwdev> source buildEnv.csh  # or just restart your shell windows
 <build@hgwdev> env | grep 2DAY
* Commit the changes to this file to Git, replacing TICKETID with the Redmine ticket number of the build chatter ticket. See the email with subject "CGI Release Chatter":
 <build@hgwdev> @ NEXTNN = ( $BRANCHNN + 1 )
 <build@hgwdev> git commit -m "v$NEXTNN preview2, refs #TICKETID" buildEnv.csh
 <build@hgwdev> git push
* Run doNewReview2.csh:
 <build@hgwdev> ./doNewReview2.csh   # review the variables
* '''REMEMBER to run preview2TablesTestRobot.csh after this build is done!''' See below.
* Run for real and direct the output to a log file (this takes about 2 minutes; it runs git reports by ssh'ing to hgwdev):
 <build@hgwdev> ./doNewReview2.csh real >& logs/v$NEXTNN.doNewRev2.log
===Check the reports===
* The reports are automatically built by the script into this [https://genecats.gi.ucsc.edu/git-reports location].
Press Ctrl-R to refresh your browser.
Briefly review the reports as a sanity check.
===Run the tables test robot===
(Takes 1 hour 40 minutes)
 <build@hgwdev> time ./preview2TablesTestRobot.csh >& logs/v$NEXTNN.preview2.hgTables.log
 <build@hgwdev> ctrl-a, d            # to detach from screen
 <build@hgwdev> source buildEnv.csh  # just to be sure
 <build@hgwdev> @ NEXTNN = ( $BRANCHNN + 1 )
 <build@hgwdev> tail -f logs/v$NEXTNN.preview2.hgTables.log
===Generate review pairings===
(Clay takes care of this)
* Assign code-review partners in redmine.
<BR><BR>
==Final Build : Day 15 ==
'''This is day 15 in the [[CGI_Build_Schedule | schedule]].'''
===Optionally Check TrackDb===
* [Not necessary] Optionally, as your regular user, first check that trackDb builds successfully:
<pre>
<user@hgwdev> cd ~/kent/src/hg/makeDb/trackDb
<user@hgwdev> make beta &> make.strict.log
<user@hgwdev> /bin/egrep -iv "html missing" make.strict.log | /bin/egrep -i "error|warn" | grep -v ignored | wc -w
# if the count is anything other than zero, there were problems; look at make.strict.log
# and send email to get the person who broke hgwbeta to fix it
<user@hgwdev> rm make.strict.log
</pre>
===Do the Build===
* Connect as "<code>build</code>" to <code>hgwdev</code>, then go to the weekly build dir on dev:
 hgwdev> ssh -X build@hgwdev         # the optional '-X' allows X-windows support
 <build@hgwdev> cd $WEEKLYBLD
* At this point, it might be a good idea to confirm that the build account is still able to log in to the servers we plan to use. Some of these scripts aren't great about failure detection, and in any case a manual resume is more work. An ounce of prevention and all that.
 <build@hgwdev> ./checkLogins.sh
This currently tests that the stored SSH keys are up-to-date for genome-euro and hgdownload, using those names (i.e. the ones used in the build scripts). If there is an error in connecting to a server, fix that first before moving on with the wrap-up.
* Make sure you are on the master branch and there are no uncommitted script changes:
 <build@hgwdev> screen               # NOTE: screen is recommended this time!
 <build@hgwdev> git checkout master
 <build@hgwdev> git status
 <build@hgwdev> git pull
* Edit the buildEnv.csh file:
 <build@hgwdev> vi buildEnv.csh      # use your preferred editor
 < setenv LASTWEEK 2012-06-26        # v269 final
 > setenv LASTWEEK 2012-07-17        # v270 final
* Re-source buildEnv.csh and check that the vars are correct:
 <build@hgwdev> source buildEnv.csh  # or just restart your shell windows
 <build@hgwdev> env | egrep "DAY|NN|WEEK"
* Commit the changes to this file to Git. Note that you will need to fetch the build ticket number from Redmine first. It should be the CGI Build ticket labeled "vNNN CGI Release Chatter"; it will be in the UCSC project in Redmine, not the GB project.
 <build@hgwdev> git commit -m "v$BRANCHNN final build, refs #NNNNN" buildEnv.csh
 <build@hgwdev> git push
* Run doNewBranch.csh:
 <build@hgwdev> ./doNewBranch.csh    # review the variables
* Run for real, send the output to a file, and review it while it is written (takes ~1 hour):
 <build@hgwdev> ./doNewBranch.csh real >& logs/v${BRANCHNN}.doNewBranch.log
 <build@hgwdev> ctrl-a, d            # to detach from the screen
 <build@hgwdev> source buildEnv.csh  # just to be sure
 <build@hgwdev> tail -f logs/v${BRANCHNN}.doNewBranch.log  # follow what happens
* Look for files that tell you it was successful (the script will report whether these files were created):
 <build@hgwdev> ls -l /cluster/bin/build/scripts/GitReports.ok
* Check the timestamp of the CGIs in /usr/local/apache/cgi-bin-beta:
 <build@hgwdev> ls -ltr /usr/local/apache/cgi-bin-beta
* Check the version number in the hgwbeta browser title header:
 https://hgwbeta.soe.ucsc.edu/cgi-bin/hgTracks
* If you get errors, it might be that the script is wrong rather than there being an actual error. For example, to check for errors the 'make' log file is grepped for 'error|warn', so any new C file with 'error' or 'warn' in its name will show up as an error whether or not it compiled cleanly. You might need to change the script to remove references to files like this, e.g. edit buildBeta.csh to ignore references to files like gbWarn.c and gbWarn.o in the log:
 <build@hgwdev> edit buildBeta.csh
 ...
 make beta >& make.beta.log
 # These flags and programs will trip the error detection
 sed -i -e "s/-DJK_WARN//g" make.beta.log
 sed -i -e "s/-Werror//g" make.beta.log
 #-- report any compiler warnings, fix any errors (shouldn't be any)
 #-- to check for errors:
 set res = `/bin/egrep -i "error|warn" make.beta.log | /bin/grep -v "gbWarn.o -c gbWarn.c" | /bin/grep -v "gbExtFile.o gbWarn.o gbMiscDiff.o"`
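The error check above boils down to "grep the log for error|warn, then subtract known false positives". A self-contained sketch of that pattern, with a fabricated two-line log for illustration:

```shell
# Sketch of the make-log error scan with false-positive filtering.
set -e
cd "$(mktemp -d)"
cat > make.beta.log <<'EOF'
gcc -O -DJK_WARN -Werror -o gbWarn.o -c gbWarn.c
gcc -O -o hgTracks.o -c hgTracks.c
EOF
# Strip flags that would trip the error detection, as buildBeta.csh does:
sed -i -e "s/-DJK_WARN//g" -e "s/-Werror//g" make.beta.log
# Whatever survives this pipeline is a real problem worth investigating
# (here: nothing, since the only "warn" hit is the known-benign gbWarn line):
grep -iE "error|warn" make.beta.log | grep -v "gbWarn.o -c gbWarn.c" || true
```

The trailing `|| true` matters in a `set -e` script: grep exits nonzero when the filtered result is empty, which is the success case here.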
* What the doNewBranch.csh script does:
===Check the reports===
* The reports are automatically built by the script into this [https://genecats.gi.ucsc.edu/git-reports location].
* Press Ctrl-R to refresh your browser.
* Briefly review the reports as a sanity check.
===Run the Robots===
* ['''build-meister'''] Run doRobots.csh, and watch the log if you are interested (most log messages go to the logs/ dir mentioned below):
 ssh -X build@hgwdev
 <build@hgwdev> screen               # start up a new screen
 <build@hgwdev> ./doRobots.csh >& logs/v$BRANCHNN.robots.log
 <build@hgwdev> ctrl-a, d            # detach from screen
 <build@hgwdev> source buildEnv.csh  # just to be sure
 <build@hgwdev> tail -f logs/v$BRANCHNN.robots.log
* What the doRobots.csh script does:
## LiftOverTest (quick)
* ['''push shepherd'''] Review the error logs for the robots:
 error logs located here: hgwdev:/cluster/bin/build/scripts/logs
* hgNear -- sends email with results
* hgTables -- sends email with results
* TrackCheck -- must check by hand: grep -i "error" logs/v$BRANCHNN.TrackCheck.log (''TrackCheck person does this'')
* LiftOverTest -- must check by hand: cat logs/v$BRANCHNN.LiftOverTest.log
NOTE: These robot tests take more than 6 hours; do not wait for them. Do the rest of the steps, like GBIB, in the meantime.
=== Genome Browser in a Box ===
Notes by Max: First of all, changes to GBIB are never made manually through SSH but through modifications of the updateBrowser.sh script. In this way, all running GBIBs get them. The only change the buildmeister makes is to run this update script to apply all modifications to the current GBIB image.
* The <code>build</code> account on <code>hgwdev</code> operates this procedure:
 ssh -X build@hgwdev
 <build@hgwdev> cd $WEEKLYBLD
The manual procedure is:
* Start the browser box:
 <build@hgwdev> VBoxHeadless -s browserbox &
* Log in to the box:
 <build@hgwdev> ssh box
* Wait for rsync updates to finish *and* any dpkg unattended_upgrades. Look for 'sleep' commands, dpkg, and sync:
 <browser@browserbox> ps -ef | egrep -i "sleep|dpkg|sync" | grep -v grep
* Update the updateBrowser.sh script, using a command defined as a shell alias:
 <browser@browserbox> gbibCoreUpdateBeta
* su to the root account:
 <browser@browserbox> su -           # password "browser"
* Run an update to download any newly required tables:
 <root@browserbox> ./updateBrowser.sh notSelf
* Run the update script to get the new CGIs:
 <root@browserbox> ./updateBrowser.sh hgwdev galt beta
* NOTE: replace "galt" with your username;
* It logs in as you and rsyncs CGIs directly from remote machines like hgwbeta.
* The system may reboot with OS upgrades; after a reboot, run the same updateBrowser.sh again.
DO NOT FORGET
* IMPORTANT: this file needs to be removed manually.
* This could be fixed in the updateBrowser.sh script by using an '''eval''' on the rsync statements.
 <root@browserbox> rm /usr/local/apache/cgi-bin/hg.conf.private
* After updateBrowser.sh has run successfully to completion without a reboot, you can now continue with the packaging.
* Log out from the box:
 <root@browserbox> exit
 <browser@browserbox> exit
* (boxRelease.csh can take 25 minutes)
 <build@hgwdev> time ./boxRelease.csh beta >& logs/v${BRANCHNN}.boxRelease.log
 <build@hgwdev> cp -p /usr/local/apache/htdocs/gbib/gbibBeta.zip /hive/groups/browser/vBox/gbibV${BRANCHNN}.zip
* The script procedure does not function correctly because the updateBrowser.sh script now performs OS updates which require one or more reboots to complete. The buildGbibAndZip.csh script runs the commands as the build user. The steps are archived for posterity but do not work correctly:
** It starts the browserbox vm if it is not already running.
** It updates the box, during which it rsyncs from gbib to hgwdev using a temporary public key.
** It builds a release gbib.zip and also saves it, with the current version, to a backup location.
** time ./buildGbibAndZip.csh >& logs/v${BRANCHNN}.boxRelease.log (takes 20 minutes)
** Examine the log for errors:
** less logs/v${BRANCHNN}.boxRelease.log
** If it gets a lot of errors, this is often because the vm updated and rebooted itself, which kills the ssh connection that buildGbibAndZip.csh is trying to use. It does not do a great job of detecting the problem; however, it often fails within 80 seconds. You can just re-run buildGbibAndZip.csh again as above, and check that it succeeded. It should take 20 minutes to run when working normally.
** END OF ARCHIVED SCRIPT PROCEDURE
=== Update and Restart qateam beta GBiB ===
As user "build" we have access to the $WEEKLYBLD and $BRANCHNN env variables.
 ssh -X build@hgwdev                 # this may have already been done
This script may or may not update the qateam browser box.
Run a script to automatically stop the old browserboxbeta, unregister it, mv the old one to a backup name, unzip a fresh copy from browserbox, rename it to browserboxbeta, re-register it, and restart it:
 ssh -X qateam@hgwdev $WEEKLYBLD/updateBrowserboxbeta.csh $BRANCHNN >& logs/v${BRANCHNN}.updateBrowserboxbeta.log
I use the following manual steps:
<pre>
ssh qateam@hgwdev
# To see what may be running:
VBoxManage list runningvms
"browserbox" {8e474be8-2808-466a-930d-5e6670ca1cb1}
# or, view all VMs:
VBoxManage list vms
"browserboxalpha" {9442ad82-5672-4ea1-aeab-b40fc9ae691f}
"browserboxbeta" {8e474be8-2808-466a-930d-5e6670ca1cb1}
# To stop the betabox:
VBoxManage controlvm browserboxbeta acpipowerbutton
# if you get an error about unregister because the box is locked, find the process
# using the vm and kill it, then proceed with the unregister:
#   ps -ef | grep boxbeta
#   run a kill command with the pid from the ps output
#   run the unregister command again
# to unregister the betabox:
VBoxManage unregistervm browserboxbeta
# reorganize directories
cd "VirtualBox VMs"
mv browserboxbeta browserboxbeta.v${BRANCHNN}  # BRANCHNN does not exist on the qateam account;
                                               # also remember that this is the OLD BRANCHNN value,
                                               # as it preserves the previous build's beta box.
mkdir browserboxbeta
cd browserboxbeta
unzip /usr/local/apache/htdocs/gbib/gbibBeta.zip
# change the ports to use and the VM image name:
sed -e 's/1234/1236/; s/1235/1237/; s/browserbox/browserboxbeta/;' \
    browserbox.vbox > browserboxbeta.vbox
# to register this new image:
cd
VBoxManage registervm `pwd`/"VirtualBox VMs/browserboxbeta/browserboxbeta.vbox"
# To start this betabox:
nice -n +19 VBoxHeadless -s browserboxbeta &
# to login to the betabox: (wait a few moments for it to get fully started)
ssh -p 1237 browser@localhost
</pre>
* Test the web server and login account:
Does it seem to return the index.html ok?
 wget <nowiki>http://localhost:1236/index.html</nowiki> -O /dev/stdout
 htmlCheck getAll <nowiki>http://localhost:1236/index.html</nowiki>
Do you see the correct CGI version?
 ssh -X qateam@hgwdev "wget http://localhost:1236/cgi-bin/hgTracks -O /dev/stdout | grep '<TITLE'"
 htmlCheck getAll <nowiki>http://localhost:1236/cgi-bin/hgTracks</nowiki> | grep '<TITLE'
Can you log in to the vm interactively?
 ssh -X qateam@hgwdev
 ssh boxBeta                         # use password 'browser' to login - uses .ssh/config to get the port.
 exit                                # exit from vm
 exit                                # exit from qateam
If you want to be sure that the vm has been correctly updated, has the right version, or to confirm that the latest patch is working there, you can browse the browserboxbeta vm, which runs on hgwdev on port 1236, via an SSH tunnel from your local machine on port 9991:
Open a new terminal window on your local machine.
 Windows:                            # change to your own username
 plink.exe -N -L 127.0.0.1:9991:127.0.0.1:1236 galt@hgwdev.gi.ucsc.edu
 Linux:
 ssh -N -L 127.0.0.1:9991:127.0.0.1:1236 $USER@hgwdev.gi.ucsc.edu
Open a web browser:
 http://127.0.0.1:9991/
On your local machine's terminal window, press control-c to terminate ssh or plink.
===Update kent-core===
* Update kent-core as yourself, not the build user.
 TODO maybe this should just be run at Final Build Wrap-up?
 TODO make a build service account at github that would be able to support this.
* Make sure your ssh keys are set up; this command should work:
<pre>
ssh -i .ssh/github -T git@github.com
</pre>
* If that doesn't work, follow these directions to get your ssh keys set up with your github account: https://docs.github.com/en/authentication/connecting-to-github-with-ssh/generating-a-new-ssh-key-and-adding-it-to-the-ssh-agent
* Note that if you have multiple ssh keys for your hgwdev user (likely), you can tell ssh to use a specific key with a .ssh/config file:
<pre>
cat ~/.ssh/config
Host github.com
  IdentityFile ~/.ssh/github
chmod 600 ~/.ssh/config
</pre>
* If you haven't done it yet, clone this Github repo into your ${HOME}/kent-core by doing:
 cd $HOME
 git clone git@github.com:ucscGenomeBrowser/kent-core.git
* Go to the top-level kent (not kent-core) directory of the tree and run this:
 cd $HOME
 cd kent-core
 git pull
 cd /hive/groups/browser/newBuild/kent
 build/kent-core/makeKentCore
* The command will update ~/kent-core with selected files and then suggest a git commit command for you to run to update it.
* This git command can be added to build/kent-core/makeKentCore in the future; we are keeping this step manual for now so we can test the procedure (Mar 2023).
* It does NOT handle sub-versions yet, i.e. if you have multiple patches and therefore multiple kent-core tags.
===Generate the code summaries and review pairings===
(Clay takes care of this)
* Assign code-review partners in redmine.
(Lou takes care of this)
* Summarize the code changes that were committed during the past week. Solicit input from the engineers.
* Update these pages with the summary:
** [https://hgwdev.gi.ucsc.edu/builds/versions.html versions]
** [http://genomewiki.soe.ucsc.edu/index.php/Genome_Browser_Software_Features Features]
* Send an email to browser-staff with links to the summaries.
===Test on hgwdev===
* Wait to hear from QA about how their CGIs look on hgwbeta. QA members should update the CGI build chatter ticket in Redmine with a "done testing" message or, if applicable, a "not following issues for this release" message. Each member of the QA team has [https://hgwdev.gi.ucsc.edu/qa/cgiTesting.html testing responsibilities].
<BR><BR>
==Make changes to code base as necessary==
'''This happens on days 15, 16, 17, and 18 in the [[CGI_Build_Schedule | schedule]].'''
* If there are problems with the build, a developer will fix the code. This fix needs to be patched into the build on hgwdev. This [[Cherry-picking_a_change_in_git | page]] explains how to do a cherry-pick on hgwdev.
==Fixing problems in the Build ==
* hgw0 is identical to the RR machines but not actually in the RR (i.e. changes there are not seen by the public).
* QA will send an email to push-request the morning of the push letting the pushers know that today is a CGI push day (this is their notice to be vigilant about pushing quickly).
* QA will ask for a push of CGIs from hgwdev to hgw0 only. IF there is a '''NEW CGI or file(s)''' going out this week, be sure to make a prominent note of it in your push request. The admins push from a script, and they will need to add your new CGI to the script. (The build-meister should not be cc'd on this email.)
As of '''April 2019''', here's a list of the CGIs and data files we push. Note: CGIs and data files may have been added since this list was created -- this is meant to be a starting point.
 cartDump cartReset das hgApi hgBeacon hgBlat hgc hgCollection hgConvert hgCustom
 hgEncodeApi hgEncodeDataVersions hgEncodeVocab hgFileSearch hgFileUi
 hgGateway hgGene hgGeneGraph hgGenome hgGtexTrackSettings hgHubConnect
 hgIntegrator hgLinkIn hgLiftOver hgLogin hgMenubar hgMirror hgNear
 hgPal hgPcr hgPublicSessions hgRenderTracks hgSession hgSuggest
 hgTables hgTracks hgTrackUi hgUserSuggestion hgVai hgVisiGene hubApi phyloPng
 hgPhyloPlace hgPhyloPlaceData
and these configuration files:
 /usr/local/apache/cgi-bin/all.joiner
 /usr/local/apache/cgi-bin/extTools.ra
 /usr/local/apache/cgi-bin/greatData/*
 /usr/local/apache/cgi-bin/hgCgiData/*
 /usr/local/apache/cgi-bin/hgGeneData/*
 /usr/local/apache/cgi-bin/hgNearData/*
 /usr/local/apache/cgi-bin/hgcData/*
 /usr/local/apache/cgi-bin/loader/*
 /usr/local/apache/cgi-bin/lsSnpPdbChimera.py
 /usr/local/apache/cgi-bin/visiGeneData/*
 /usr/local/apache/cgi-bin/pyLib/*
For these directories we request an rsync --delete (from hgwbeta to the RR)
 /usr/local/apache/htdocs/js/*
 /usr/local/apache/htdocs/style/*
* Run TrackCheck on hgw0. This is the responsibility of the QA person who tests hgTracks.
# make a props file which specifies the machine/db to check. Set zoomCount=1 and it will only check the default position for each assembly. Example props file for hgw0:<br /><blockquote><pre>machine mysqlbeta.soe.ucsc.edu #This is where TrackCheck checks for active databases (active=1). These databases may not be on the RR and it will give errors which can be ignored.</pre><pre>server hgw0.soe.ucsc.edu # this is the machine that you are testing.</pre><pre>quick false</pre><pre>dbSpec all #You can list just one database here.</pre><pre>table all #You can list one table if need be.</pre><pre>zoomCount 1 # if the number is greater than one it will check links at higher zoom levels.</pre></blockquote>
# run it from hgwdev: <code>nohup TrackCheck hgw0.props >& $WEEKLYBLD/logs/TrackCheck-hgw0.07-13-2006</code>
#* run in the background if desired by typing Ctrl-Z then "<code>bg</code>"; to check status type "<code>jobs</code>" or "<code>ps -ef | grep TrackCheck</code>".
# examine the file for errors.
* Monitor the Apache error log (QA does this); see examples [[Apache_error_log_output | here]]:
 hgw0:/usr/local/apache/logs/error_log
To watch the log without line wraps, type "[http://en.wikipedia.org/wiki/Less_(Unix) less -S] error_log". Typing capital "F" will allow you to follow incoming errors. When errors arise you can type Ctrl-C and use the right arrow to scroll the window over to see the entire message.
'''*Update*''': To view the error log *without* Hiram's CGI_TIME entries (for background info see: http://redmine.soe.ucsc.edu/issues/10081):
 $ tail -f error_log | grep -v CGI_TIME
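To see what the grep -v filter above actually does, here is a runnable sketch with fabricated log lines (these are invented for illustration, not real Apache output):

```shell
# fabricated sample error_log mixing CGI_TIME timing entries with a real error
printf '%s\n' \
  '[Tue] CGI_TIME: hgTracks 123ms' \
  '[Tue] [error] hgc: something failed' \
  '[Tue] CGI_TIME: hgc 45ms' > error_log.sample
# drop the timing noise, keep the real errors
grep -v CGI_TIME error_log.sample
```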
* Wait to hear from QA about how their CGIs look on hgw0. Each member of the QA team has [https://hgwdev.gi.ucsc.edu/qa/cgiTesting.html testing responsibilities]. Check also that TrackCheck ran successfully.
===Push to hgwN only===
* hgwN is one of the RR machines, hgw1-6. Each build, rotate to the next machine in numeric order, i.e. hgw1 then hgw2, etc., so that one machine is not worked harder than the others.
* Once the new CGIs are on hgwN the push shepherd will watch the error logs for a short while to make sure no new errors occur under load.
RR machines have been updated.
Normally this is run once at the end of the cycle. However, occasionally it is necessary to patch a build after it is already released on the RR. Depending upon the extent of the patch, it may be desirable or even necessary to rerun the wrap-up. All of these scripts can be safely rerun until the next build is made (until BRANCHNN is updated in buildEnv.csh).
* Connect as "<code>build</code>" on <code>hgwdev</code> this time. Then go to the weekly build dir on dev
 hgwdev> ssh -X build@hgwdev   # the optional '-X' allows X-windows support
 <build@hgwdev> cd $WEEKLYBLD
* build and push hgcentral IF there are any changes
 <build@hgwdev> ./buildHgCentralSql.csh >& logs/v${BRANCHNN}.buildHgCentralSql.log
 <build@hgwdev> cat logs/v${BRANCHNN}.buildHgCentralSql.log
 <build@hgwdev> ./buildHgCentralSql.csh real >>& logs/v${BRANCHNN}.buildHgCentralSql.log
 <build@hgwdev> echo $status
* check that the hgcentral.sql has been updated:
 http://hgdownload.soe.ucsc.edu/admin/
* build 'userApps' target (various utilities) on hgwdev and scp them to hgdownload
 <build@hgwdev> screen   # if desired
 <build@hgwdev> time ./doHgDownloadUtils.csh >& logs/v${BRANCHNN}.doHgDownloadUtils.log
 <build@hgwdev> echo $status
(takes 45 minutes)
* check that the utils | * check that dates on the utils to verify they were updated: | ||
http://hgdownload.soe.ucsc.edu/admin/exe/linux.x86_64 | |||
http://hgdownload.soe.ucsc.edu/admin/exe/linux.x86_64/blat/ | |||
* update the beta tag to match the release:
 <build@hgwdev> cd $WEEKLYBLD
 <build@hgwdev> env | egrep "BRANCHNN|TODAY|LASTWEEK"   # (just to make sure it looks right)
 <build@hgwdev> ./tagBeta.csh >& logs/v${BRANCHNN}.tagBeta.log
 <build@hgwdev> cat logs/v${BRANCHNN}.tagBeta.log
 <build@hgwdev> ./tagBeta.csh real >>& logs/v${BRANCHNN}.tagBeta.log
 <build@hgwdev> echo $status
* If this fails because something very large was committed during the release, temporarily increase the size limit in the kent repo, then finish the steps from tagBeta.csh manually, starting at this step: git push origin beta
* Be sure to restore the limit in the kent repo when done.
* tag the official release
 <build@hgwdev> cd $WEEKLYBLD
 <build@hgwdev> git fetch
 <build@hgwdev> git tag | grep "v${BRANCHNN}_branch"
 # Note
 # if the output is empty, that is fine; it just means none have been created yet, so make branch.1
 # otherwise use the next available branch, .2 or whatever is the next unused subversion number
 <build@hgwdev> git push origin origin/v${BRANCHNN}_branch:refs/tags/v${BRANCHNN}_branch.1
 <build@hgwdev> git fetch
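Picking "the next unused subversion number" can be scripted. The following is a hypothetical helper (not part of the existing build scripts) that parses the kind of output git tag produces above; the sample tag list is hard-coded so the sketch runs anywhere:

```shell
# hypothetical helper: compute the next free v${BRANCHNN}_branch.N suffix
BRANCHNN=457
existing="v457_branch.1
v457_branch.2"                     # stand-in for: git tag | grep "v${BRANCHNN}_branch"
# strip the common prefix, keep the numeric suffixes, take the highest
last=$(printf '%s\n' "$existing" | sed -n "s/^v${BRANCHNN}_branch\.//p" | sort -n | tail -1)
next=$(( ${last:-0} + 1 ))         # empty tag list yields .1
echo "v${BRANCHNN}_branch.$next"   # the tag to push next
```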
* zip the source code
 <build@hgwdev> cd $WEEKLYBLD
 <build@hgwdev> time ./doZip.csh >& logs/v${BRANCHNN}.doZip.log   # this is automatically pushed to hgdownload
 <build@hgwdev> echo $status
(takes 4 minutes)
* check that the source code .zip files were updated:
 http://hgdownload.soe.ucsc.edu/admin/
* <B>WAIT</B> 10 minutes, then run ./userApps.sh to package up the userApps/ directory with its source and push it to hgdownload htdocs/admin/exe/
 time ./userApps.sh >& logs/v${BRANCHNN}.userApps.log
(takes 1 minute)
* check the dates on the userApps src.tgz to verify they were updated:
 http://hgdownload.soe.ucsc.edu/admin/exe
* The macOSX binaries can now also be constructed | |||
# on your Mac laptop, you should have a kent source tree on the beta branch | |||
cd $HOME/kent/src/userApps | |||
git branch | |||
# * beta | |||
# master | |||
# verify the git pull brings in the next version, check before and after pull | |||
grep CGI ../hg/inc/versionInfo.h | |||
git pull | |||
grep CGI ../hg/inc/versionInfo.h | |||
# should see the new version number | |||
# remove the bin directory and update this source and build new binaries | |||
rm -fr bin | |||
make update | |||
# the constructed index page needs to get to hgwdev: | |||
scp -p kentUtils.Documentation.txt yourUserName@hgwdev.gi.ucsc.edu:/tmp/ | |||
# one of the binaries always makes an extra unneeded directory | |||
find ./bin -type d | |||
# ./bin | |||
# ./bin/bedToExons.dSYM | |||
# ./bin/bedToExons.dSYM/Contents | |||
# ./bin/bedToExons.dSYM/Contents/Resources | |||
# ./bin/bedToExons.dSYM/Contents/Resources/DWARF | |||
# do not need the bedToExons.dSYM directory | |||
rm -fr bin/bedToExons.dSYM | |||
# the blat binaries need to be in their own directory | |||
mkdir bin/blatexe | |||
mv bin/blat bin/blatexe | |||
mv bin/blatexe bin/blat | |||
mv bin/gfServer bin/blat | |||
mv bin/gfClient bin/blat | |||
# this bin directory can be copied to hgwdev | |||
cd bin | |||
rsync -a -P ./ yourUserName@hgwdev.gi.ucsc.edu:/usr/local/apache/htdocs-hgdownload/admin/exe/macOSX.x86_64/ | |||
# check that dynamic libraries make sense. The command on Mac OSX to show | |||
# dynamic libraries is: otool -L which can be aliased to: alias ldd='otool -L' thus: | |||
ldd ./hgsql | |||
# /usr/lib/libc++.1.dylib (compatibility version 1.0.0, current version 400.9.4) | |||
# /usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 1252.250.1) | |||
# Now, back on hgwdev, | |||
# verify the change in the index page makes sense | |||
cd /usr/local/apache/htdocs-hgdownload/admin/exe/macOSX.x86_64 | |||
diff FOOTER.txt /tmp/kentUtils.Documentation.txt | |||
# extra credit: update README.txt to indicate this release version and note the dynamic
# libraries and Mac OSX version that were used
# if that is reasonable, it can replace FOOTER.txt | |||
cp -p /tmp/kentUtils.Documentation.txt ./FOOTER.txt | |||
# copy to hgdownload to release these binaries: | |||
rsync -a -P ./ qateam@hgdownload.soe.ucsc.edu:/mirrordata/apache/htdocs/admin/exe/macOSX.x86_64/ | |||
# take a look there to see if it makes sense | |||
# | |||
* Push to the genome browser store from hgwdev | |||
ssh hgwdev #open a new terminal and login to hgwdev as yourself, not the build user. | |||
<you@hgwdev> sudo /cluster/bin/scripts/gbib_gbic_push | |||
<you@hgwdev> exit # close the terminal window | |||
* Create a Release on github for the release tag. To start, we can do this by visiting https://github.com/ucscGenomeBrowser/kent/releases/new, selecting the vXXX_branch.X tag, and marking that tag as a release on the master branch. Consider linking to https://genecats.gi.ucsc.edu/builds/versions.html in the release description, or else copying the same text you'll use to describe the release notes in the genome-mirror email (below). In the future we can try installing the "gh" command-line tool to interface with github, then run something like "gh release create v467_branch.1". | |||
* <B>WAIT</B> a day for the nightly rsync to happen from the RR for cgi-bin/ and htdocs/ hierarchies to hgdownload
* confirm cgi-bin/ and htdocs/ on hgdownload are up to date:
 ftp://hgdownload.soe.ucsc.edu/apache/cgi-bin/
 ftp://hgdownload.soe.ucsc.edu/apache/htdocs-rr/
Browsers no longer handle ftp well, but you can still view those ftp directories in Windows File Explorer.
Note that when you rsync from hgdownload,
htdocs/ is mapped to htdocs-rr/ internally.
* send email to genome-mirror@soe.ucsc.edu.
* Refer to this page when making the summary:
 https://genecats.gi.ucsc.edu/builds/versions.html
Include this link to the latest source: http://hgdownload.soe.ucsc.edu/admin/jksrc.zip.
Use the last email as a template (see https://www.soe.ucsc.edu/pipermail/genome-mirror).
If you push the hgcentral.sql, make sure to mention this has also changed in the email. If you're having trouble formatting the CGI name list for 80 columns, the UNIX fold -s command should help.
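For example, fold -s wraps a long one-line CGI list at word boundaries so no output line exceeds 80 columns (the list here is a shortened sample):

```shell
# wrap a long space-separated list at 80 columns, breaking only between words
cgis="cartDump cartReset das hgApi hgBeacon hgBlat hgc hgCollection hgConvert hgCustom hgGateway hgGene hgGeneGraph hgGenome hgGtexTrackSettings hgHubConnect hgIntegrator hgLiftOver"
echo "$cgis" | fold -s -w 80
```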
<B>Example:</B>
<pre>
To: genome-mirror@soe.ucsc.edu
Subject: v457 Genome Browser Available
Good Afternoon Genome Browser Mirror Site Operators: | |||
Version v457 of the UCSC Genome Browser software has now been | |||
released. | |||
The main changes are: | |||
Added (i) icons to hgLiftOver, to explain the various obscure options without so much text on the page. | |||
Note that this uses the new printInfoIcon(text) function, which we can use on many CGIs. | |||
Try again: users can now query old NCBI transcripts, both by name and in hgvs coordinates, for hg38 only. | |||
Fixed problems with RTS load where composite and supertrack children were stuck in positions set by previous RTS loads.
hgSearch now allows clicking the text to show/hide results instead of just the small button. | |||
Made more improvements to tooltips: left click or contextmenu click closes tooltip, | |||
fix tooltip y positioning when it would normally be above the viewport, left justify the tooltip text. | |||
Documented "accession1;accession2" query in the position box. | |||
Snp hgc pages now use mitochondrial codon table. | |||
Fixed never-used multi-term searches that each resolves to a singlePos structure to show a range with each item highlighted. | |||
Fixed problems with forcing bigBed filtering to pass items found with hgFind. | |||
Staged work in progress hgc code for creating a table of links to associated chains for the chainHRPC tracks. | |||
Made various changes for Chinese internet rules. | |||
Now using forceTwoBit by default, no more .nib files are needed. | |||
Fixed OMIM tracks error on GBIB and GBIC, no redmine, various emails. | |||
gnomAD v4 VCF track for hg38.
For a comprehensive list of changes for this version, please | |||
visit/bookmark (you may need to force refresh in your web browser to | |||
see an updated page): | |||
https://genecats.gi.ucsc.edu/builds/versions.html | |||
We typically release a new version every three weeks. A summary of all | |||
past releases and changes can be found at the above link. | |||
The new source code tree is available at:
http://hgdownload.soe.ucsc.edu/admin/jksrc.zip | |||
or labeled with version number at: | |||
http://hgdownload.soe.ucsc.edu/admin/jksrc.v457.zip
or in our Github repository in the "beta" branch and also tagged with | |||
the version number. | |||
If you use the GBIB virtual machine with auto-updates enabled, it will | |||
automatically update itself on Sunday. If you have installed your UCSC | |||
Genome Browser server with the GBIC installation script | |||
browserSetup.sh, then you can upgrade it with the command "sudo bash | |||
browserSetup.sh update". | |||
If you have installed your Genome Browser manually:
- You can use the following rsync command to copy the CGI binaries
into the cgi-bin directory of your Apache server: | |||
rsync -avP rsync://hgdownload.soe.ucsc.edu/cgi-bin/ ${WEBROOT}/cgi-bin/ | |||
- We provide a shell script to update the htdocs directory, htdocs and | |||
cgi-bin must be updated together: | |||
https://github.com/ucscGenomeBrowser/kent/blob/master/src/product/scripts/updateHtml.sh | |||
- The hgcentral database contains our curated list of public track | |||
hubs and pointers to BLAT servers per database and should be updated | |||
regularly. We have released a new MySQL dump of this database: | |||
http://hgdownload.soe.ucsc.edu/admin/hgcentral.sql | |||
A license is required for commercial download and/or installation of | |||
the Genome Browser binaries and source code. No license is needed for | |||
academic, nonprofit, and personal use. More information on our | |||
licensing page: | |||
http://genome.ucsc.edu/license/ | |||
If you have any questions or concerns, please feel free to write back
to this genome-mirror mail list.

Galt Barber
UCSC Genomics Institute
</pre>
[[Category:Browser QA]]
[[Category:Browser QA CGI]]
Latest revision as of 04:50, 8 October 2024
This page explains the process we use for building and releasing our CGIs. This is done on a three-week schedule.
- Before week 1, the source code is said to be in the "preview1" state.
- During the first week, any changes by QA or developers are added to the tree
- After week 1, the source code is said to be in the "preview2" state.
- During the second week, just like in week 1, any changes by QA or developers are added to the tree
- After week 2, the final build is compiled and copied to a sandbox
- During the third week, development continues: all changes are added and compiled on genome-test, but only bugfixes (build patches) are added to the final build sandbox ("git cherry-pick").
- After week 3, the now-bugfixed final build from week 2 is copied from its sandbox to the public site.
The build after week 2 goes into sandboxes located here:
 hgwdev:/usr/local/apache/cgi-bin
 hgwdev:/usr/local/apache/htdocs-beta
Older hgwdev builds are periodically relocated here:
/hive/groups/browser/build
Setting Up the Environment for the Build
NOTE: The actions in this section are a one-time only set up performed by the new "build-meister".
Becoming the Build-Meister
All build scripts are now run by the "build" user. This user should already have its environment properly configured. However, the build-meister will need to be able to log in (through ssh) as the build user, and the build user will need to know where to send mail.
- Set up .ssh/authorized_keys so that you can log in as the build user. Seek assistance from cluster-admin if you need it.
- Assign the build user's BUILDMEISTEREMAIL environment variable to equal the user name of the new build-meister
 hgwdev> ssh -X build@hgwdev   # the optional '-X' allows X-windows support
 <build@hgwdev> edit .tcshrc   # use your preferred editor
 # alter the following line:
 < setenv BUILDMEISTEREMAIL tdreszer
 > setenv BUILDMEISTEREMAIL chinli
Remember you will need to log out and log back in for the changes to take effect.
- As the new build-meister, git commits from the build user should be associated with your email address. You should change the git configuration to use it:
 hgwdev> ssh -X build@hgwdev   # the optional '-X' allows X-windows support
 <build@hgwdev> edit .gitconfig   # use your preferred editor
 # alter the following line:
 < email = oldbuildmeister@soe.ucsc.edu
 > email = youremail@soe.ucsc.edu
- Make sure the build user's cron jobs send you mail. Unlike the various build scripts, which can use the BUILD_MEISTER environment variable to find you, cron runs without access to that variable. Instead, you should add yourself to cron's MAILTO variable:
 hgwdev> ssh -X build@hgwdev   # the optional '-X' allows X-windows support
 <build@hgwdev> crontab -l > cron.txt
 <build@hgwdev> edit cron.txt   # use your preferred editor
 < MAILTO=rhead,tdreszer,braney,cricket
 > MAILTO=rhead,chinli,braney,cricket
 # and
 < MAILTO=tdreszer,braney
 > MAILTO=chinli,braney
 <build@hgwdev> crontab cron.txt
 <build@hgwdev> crontab -l   # verify what you have done.
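Instead of editing cron.txt by hand, the MAILTO swap can be scripted with sed. A sketch against a sample file (the addresses are the examples from above; the cron job path is a placeholder):

```shell
# build a sample crontab dump, then rewrite its MAILTO line in place
cat > cron.txt <<'EOF'
MAILTO=rhead,tdreszer,braney,cricket
0 5 * * * /cluster/bin/build/scripts/someJob.csh
EOF
sed -i 's/^MAILTO=.*/MAILTO=rhead,chinli,braney,cricket/' cron.txt
grep '^MAILTO=' cron.txt   # confirm the new recipient list
```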
Optionally set up your own build environment
NOTE: The remainder of this section is historical. Nevertheless, it is worth keeping these details here, especially if the build-meister wishes to try experimental changes under their own identity.
- Before running build scripts as yourself, you will need to set up the following in your login file:
 hgwdev> cd ~   # go to your home directory as yourself.
 hgwdev> edit .tcshrc   # use your preferred editor
 # add (or update) the following lines:
 > umask 002
 > source /cluster/bin/build/scripts/buildEnv.csh
 # To be able to run the Java robot programs, add the following to the top of your path setting:
 > set path = ( /usr/java/default/bin $path ... )
 # optionally add helper aliases:
 > # wb gets you to the scripts dir.
 > alias wb 'cd $WEEKLYBLD'
 > # cd $hg gets you to the latest build sandbox
 > if ( "$HOST" == "hgwdev" ) then
 >    setenv hg $BUILDDIR/v${BRANCHNN}_branch/kent/src/hg
 > endif
Remember you will need to log out and log back in for the changes to take effect.
NOTE: For those more comfortable with other shells (e.g. bash), it should be possible to run build scripts from another shell. However, the main limitation is the buildEnv.csh file, which is edited and checked in every week, then sourced by .tcshrc. Without changes it cannot be sourced by .bashrc.
- Set up autologin among the general cluster machines
 # On your local soe box (i.e. screech, pfft, whatever)
 screech> ssh-keygen -t dsa   # use enter for all defaults
 screech> cd ~/.ssh
 # add yourself to the authorized keys
 screech> cp id_dsa.pub authorized_keys
Also put these in your home directory on hgwdev:
 screech> scp -r .ssh/ hgwdev:
 # Permissions on .ssh should be 700.
 # Permissions on files in .ssh/ should be 600 or 640.
- Set up autologin for hgdownload by copying your public key to the list of authorized keys on those machines. You may need assistance from someone already authorized to login to hgdownload:
 hgwdev> edit ~/.ssh/id_dsa.pub
 # copy the public key into the clipboard and then log into hgdownload as user qateam
 hgwdev> ssh qateam@hgdownload
 hgdownload> cd ~/.ssh
 hgdownload> edit authorized_keys
 # paste the key to the authorized_keys file
- You will also need a copy of .hg.conf.beta in your $HOME directory. This should be obtained from /cluster/home/build/.hg.conf.beta.
- Build Symlinks. These are critical for building 64 bit utilities
 hgwdev> cd ~/bin
 # Make sure you have $MACHTYPE directories
 hgwdev> mkdir x86_64
 # Create a symlink for each $MACHTYPE
 hgwdev> ln -s /cluster/bin/x86_64 x86_64.cluster
The symtrick.csh script uses these automatically. If a script crashes and leaves the symlinks in an incorrect state, use unsymtrick.csh to restore them. Build scripts check to see if unsymtrick.csh should be executed.
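A quick way to verify a symlink of this kind points where you expect is readlink. The sketch below uses a throwaway directory, since /cluster/bin/x86_64 only exists on the cluster (a symlink's target need not exist for the link to be created and inspected):

```shell
# create the build-style symlink in a temp dir and confirm its target
tmp=$(mktemp -d)
mkdir "$tmp/bin"
ln -s /cluster/bin/x86_64 "$tmp/bin/x86_64.cluster"   # target need not exist locally
readlink "$tmp/bin/x86_64.cluster"                    # prints the link target
```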
Preview1 Day Build : Day 1
This is day 1 in the schedule.
Run Git Reports
- Connect as "build" to hgwdev. Then go to the weekly build dir on dev:
 hgwdev> ssh -X build@hgwdev   # the optional '-X' allows X-windows support
 <build@hgwdev> cd $WEEKLYBLD
- make sure you are on master branch
 <build@hgwdev> screen   # use screen if you wish
 <build@hgwdev> git checkout master
 <build@hgwdev> git pull
- edit buildEnv.csh: change the 5th line then the 4th line
 <build@hgwdev> vi buildEnv.csh   # use your preferred editor
 < setenv LASTREVIEWDAY 2012-06-12   # v269 preview
 > setenv LASTREVIEWDAY 2012-07-03   # v270 preview
 # and
 < setenv REVIEWDAY 2012-07-03   # v270 preview
 > setenv REVIEWDAY 2012-07-24   # v271 preview
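The two-line date roll (the old REVIEWDAY becomes LASTREVIEWDAY, and REVIEWDAY gets the new date) can also be done with sed instead of an editor. This is a sketch against a sample buildEnv.csh, not the real file; dates are from the example above, and you should still verify the result before committing:

```shell
# sample of the two lines being rolled forward
cat > buildEnv.csh <<'EOF'
setenv LASTREVIEWDAY 2012-06-12 # v269 preview
setenv REVIEWDAY 2012-07-03 # v270 preview
EOF
# capture the current REVIEWDAY date, then rewrite both lines
old=$(sed -n 's/^setenv REVIEWDAY \([0-9-]*\).*/\1/p' buildEnv.csh)
sed -i -e "s/^setenv LASTREVIEWDAY .*/setenv LASTREVIEWDAY $old # v270 preview/" \
       -e "s/^setenv REVIEWDAY .*/setenv REVIEWDAY 2012-07-24 # v271 preview/" buildEnv.csh
grep REVIEWDAY buildEnv.csh   # shows both updated lines
```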
- re-source buildEnv.csh and check that vars are correct
 <build@hgwdev> source buildEnv.csh   # or just restart your shell windows
 <build@hgwdev> env | egrep "VIEWDAY"
- commit the changes to this file to Git. For TICKETNUM, use the chatter ticket of the previous build:
 <build@hgwdev> @ NEXTNN = ( $BRANCHNN + 1 )
 <build@hgwdev> git commit -m "v$NEXTNN preview1, refs #TICKETNUM" buildEnv.csh
 <build@hgwdev> git push
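The @ NEXTNN = ( $BRANCHNN + 1 ) line is csh arithmetic. For anyone scripting these steps outside tcsh, the POSIX sh equivalent is:

```shell
# POSIX sh equivalent of the csh "@ NEXTNN = ( $BRANCHNN + 1 )"
BRANCHNN=270
NEXTNN=$(( BRANCHNN + 1 ))
echo "v$NEXTNN preview1"
```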
- run doNewReview.csh
<build@hgwdev> ./doNewReview.csh # review the variables
- run for real and direct output to a log file (this takes about 2 minutes - it runs git reports by ssh'ing to hgwdev)
 <build@hgwdev> time ./doNewReview.csh real >& logs/v${NEXTNN}.doNewRev.log
 <build@hgwdev> ctrl-a, d   # to detach from screen
 <build@hgwdev> source buildEnv.csh   # just to be sure
 <build@hgwdev> @ NEXTNN = ( $BRANCHNN + 1 )
 <build@hgwdev> tail -f logs/v${NEXTNN}.doNewRev.log   # see what happens
Check the reports
- The reports are automatically built by the script into this location.
Press Ctrl-R to refresh your browser. Briefly review the reports as a sanity check.
Generate review pairings
(Clay takes care of this)
- Assign code-review partners in redmine.
Preview2 Day Build : Day 8
This is day 8 in the schedule.
Run Git Reports
- Connect as "build" to hgwdev. Then go to the weekly build dir on dev
 hgwdev> ssh -X build@hgwdev   # the optional '-X' allows X-windows support
 <build@hgwdev> cd $WEEKLYBLD
- make sure you are on master branch
 <build@hgwdev> screen   # use screen if you wish
 <build@hgwdev> git checkout master   # or just: git branch
 <build@hgwdev> git pull
- edit buildEnv.csh: change the 7th line then the 6th line
 <build@hgwdev> vi buildEnv.csh   # use your preferred editor
 < setenv LASTREVIEW2DAY 2012-06-19   # v269 preview2
 > setenv LASTREVIEW2DAY 2012-07-10   # v270 preview2
 # and
 < setenv REVIEW2DAY 2012-07-10   # v270 preview2
 > setenv REVIEW2DAY 2012-07-31   # v271 preview2
- re-source buildEnv.csh and check that vars are correct
 <build@hgwdev> source buildEnv.csh   # or just restart your shell windows
 <build@hgwdev> env | grep 2DAY
- commit the changes to this file to Git, replacing TICKETID with the Redmine ticket number for the build chatter ticket.
See email subject "CGI Release Chatter":
 <build@hgwdev> @ NEXTNN = ( $BRANCHNN + 1 )
 <build@hgwdev> git commit -m "v$NEXTNN preview2, refs #TICKETID" buildEnv.csh
 <build@hgwdev> git push
- run doNewReview2.csh
<build@hgwdev> ./doNewReview2.csh # review the variables
- ! # ! REMEMBER TO RUN THE preview2TablesTestRobot.csh after this build is done ! # ! See below.
- run for real and direct output to a log file (this takes about 2 minutes - it runs git reports by ssh'ing to hgwdev)
<build@hgwdev> ./doNewReview2.csh real >& logs/v$NEXTNN.doNewRev2.log
Check the reports
- The reports are automatically built by the script into this location.
Press Ctrl-R to refresh your browser. Briefly review the reports as a sanity check.
Run the tables test robot
(Takes 1 hour 40 minutes)
 <build@hgwdev> time ./preview2TablesTestRobot.csh >& logs/v$NEXTNN.preview2.hgTables.log
 <build@hgwdev> ctrl-a, d   # to detach from screen
 <build@hgwdev> source buildEnv.csh   # just to be sure
 <build@hgwdev> @ NEXTNN = ( $BRANCHNN + 1 )
 <build@hgwdev> tail -f logs/v$NEXTNN.preview2.hgTables.log
Generate review pairings
(Clay takes care of this)
- Assign code-review partners in redmine.
Final Build : Day 15
This is day 15 in the schedule.
Optionally Check TrackDb
- [Not necessary] Optionally, as your regular user, first check that trackDb builds successfully:
 <user@hgwdev> cd ~/kent/src/hg/makeDb/trackDb
 <user@hgwdev> make beta &> make.strict.log
 <user@hgwdev> /bin/egrep -iv "html missing" make.strict.log | /bin/egrep -i "error|warn" | grep -v ignored | wc -w
 # if anything other than zero there were problems, look at make.strict.log and send email
 # to get the person who broke hgwbeta to fix it
 <user@hgwdev> rm make.strict.log
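Here is a runnable sketch of that error-count pipeline against a fabricated make.strict.log (the log lines are invented, and grep -E stands in for /bin/egrep). One ignorable "html missing" line and one whitelisted "ignored" warning are filtered out, leaving the word count of the single real error line:

```shell
# fabricated make log: an ignorable line, a real error, a whitelisted warning
cat > make.strict.log <<'EOF'
trackDb html missing for someTrack
Error: duplicate track xyz
warning: deprecated setting, ignored
all done
EOF
grep -Eiv "html missing" make.strict.log | grep -Ei "error|warn" | grep -v ignored | wc -w
```

A nonzero count means some error or warning survived the filters and the log needs a look.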
Do the Build
- Connect as "build" to hgwdev. Then go to the weekly build dir on dev
 hgwdev> ssh -X build@hgwdev   # the optional '-X' allows X-windows support
 <build@hgwdev> cd $WEEKLYBLD
- At this point, it might be a good idea to confirm that the build account is still able to log in to the servers we plan to use. Some of these scripts aren't great about failure detection, and in any case a manual resume is more work. An ounce of prevention and all that.
 <build@hgwdev> ./checkLogins.sh
This currently tests that the stored SSH keys are up-to-date for genome-euro and hgdownload, using those names (i.e. the ones used in the build scripts). If there is an error in connecting to a server, fix that first before moving on with the wrap-up.
- make sure you are on master branch and there is no uncommitted script change
 <build@hgwdev> screen   # NOTE: screen is recommended this time!
 <build@hgwdev> git checkout master
 <build@hgwdev> git status
 <build@hgwdev> git pull
- edit the buildEnv.csh file
 <build@hgwdev> vi buildEnv.csh   # use your preferred editor
 < setenv LASTWEEK 2012-06-26   # v269 final
 > setenv LASTWEEK 2012-07-17   # v270 final
 # and
 < setenv TODAY 2012-07-17   # v270 final
 > setenv TODAY 2012-08-07   # v271 final
 # and the big one:
 < setenv BRANCHNN 270
 > setenv BRANCHNN 271
- re-source buildEnv.csh and check that vars are correct
 <build@hgwdev> source buildEnv.csh   # or just restart your shell windows
 <build@hgwdev> env | egrep "DAY|NN|WEEK"
- commit the changes to this file to Git. Note that you will need to fetch the build ticket number from Redmine first. It should be CGI Build ticket labeled "vNNN CGI Release Chatter" - it will be in the UCSC project in Redmine, not the GB project.
 <build@hgwdev> git commit -m "v$BRANCHNN final build, refs #NNNNN" buildEnv.csh
 <build@hgwdev> git push
- run doNewBranch.csh
<build@hgwdev> ./doNewBranch.csh # review the variables
- run for real, send the output to a file and review while it is written (takes ~1 hour)
 <build@hgwdev> ./doNewBranch.csh real >& logs/v${BRANCHNN}.doNewBranch.log
 <build@hgwdev> ctrl-a, d   # to detach from the screen
 <build@hgwdev> source buildEnv.csh   # just to be sure
 <build@hgwdev> tail -f logs/v${BRANCHNN}.doNewBranch.log   # follow what happens
- look for files that tell you it was successful (script will report whether these files were created):
<build@hgwdev> ls -l /cluster/bin/build/scripts/GitReports.ok
- Check timestamp of CGIs in /usr/local/apache/cgi-bin-beta
<build@hgwdev> ls -ltr /usr/local/apache/cgi-bin-beta
- Check the version number in the hgwbeta browser title header.
https://hgwbeta.soe.ucsc.edu/cgi-bin/hgTracks
- If you get errors, it might be because the script is wrong rather than there actually being an error. For example, to check for errors, the 'make' log file is grepped for 'error|warn', so any new 'C' file with 'error' or 'warn' in its name will show up as an error whether or not it compiled cleanly. You might need to change the script to remove references to such files, e.g. edit buildBeta.csh to ignore references to files like gbWarn.c and gbWarn.o in the log:
 <build@hgwdev> edit buildBeta.csh
 ...
 make beta >& make.beta.log
 # These flags and programs will trip the error detection
 sed -i -e "s/-DJK_WARN//g" make.beta.log
 sed -i -e "s/-Werror//g" make.beta.log
 #-- report any compiler warnings, fix any errors (shouldn't be any)
 #-- to check for errors:
 set res = `/bin/egrep -i "error|warn" make.beta.log | /bin/grep -v "gbWarn.o -c gbWarn.c" | /bin/grep -v "gbExtFile.o gbWarn.o gbMiscDiff.o"`
- What the doNewBranch.csh script does:
- edits the versionInfo.h file
- makes tags (takes 1 minute)
- builds Git reports (takes 1 minute)
- does build (takes 5-10 minutes)
- builds utils (of secondary importance)
- builds CGIs (most important)
Check the reports
- The reports are automatically built by the script into this location.
- Press Ctrl-R to refresh your browser.
- Briefly review the reports as a sanity check.
Run the Robots
- [build-meister] run doRobots.csh, and watch the log if you are interested (most log messages go to the logs/ dir mentioned below)
 ssh -X build@hgwdev
 <build@hgwdev> screen                                  # startup a new screen
 <build@hgwdev> ./doRobots.csh >& logs/v$BRANCHNN.robots.log
 <build@hgwdev> ctrl-a, d                               # detach from screen
 <build@hgwdev> source buildEnv.csh                     # just to be sure
 <build@hgwdev> tail -f logs/v$BRANCHNN.robots.log
- What the doRobots.csh script does:
- runs robots one at a time
- hgNear (20 min)
- hgTables (several hours)
- TrackCheck (several hours)
- LiftOverTest (quick)
- [push shepherd] Review the error logs for the robots:
error logs located here: hgwdev:/cluster/bin/build/scripts/logs
- hgNear -- sends email with results
- hgTables -- sends email with results
- TrackCheck -- must check by hand: grep -i "error" logs/v$BRANCHNN.TrackCheck.log (TrackCheck person does this)
- LiftOverTest -- must check by hand: cat logs/v$BRANCHNN.LiftOverTest.log
NOTE: These robot tests take more than 6 hours; do not wait for them. Do the rest of the steps, like GBIB, in the meantime.
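The log reviews above can be batched. Below is a hypothetical helper (not one of the existing build scripts) that scans any number of robot logs for errors in one pass; the log file names in the usage example follow the pattern used in this section:

```shell
# Hypothetical helper: scan robot logs for errors in one pass instead of
# grepping each one by hand. Returns nonzero if any log contains an error.
scan_robot_logs() {
    # usage: scan_robot_logs LOGFILE...
    status=0
    for log in "$@"; do
        if grep -qi "error" "$log"; then
            echo "ERRORS in $log"
            status=1
        else
            echo "OK: $log"
        fi
    done
    return $status
}
```

For example: `scan_robot_logs logs/v$BRANCHNN.TrackCheck.log logs/v$BRANCHNN.LiftOverTest.log`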
Genome Browser in a Box
Notes by Max: First of all, changes to GBIB are never made manually through SSH but through modifications of the updateBrowser.sh script. That way, all running GBIBs get the changes. The only thing the buildmeister does manually is run this update script, which applies all modifications to the current GBIB image.
- The <code>build</code> account on <code>hgwdev</code> operates this procedure:
 ssh -X build@hgwdev
 <build@hgwdev> cd $WEEKLYBLD
The manual procedure is:
- Start the browser box:
<build@hgwdev> VBoxHeadless -s browserbox &
- login to the box:
<build@hgwdev> ssh box
- wait for rsync updates to finish *and* any dpkg unattended-upgrades. Look for 'sleep' commands, dpkg, and sync:
<browser@browserbox> ps -ef | egrep -i "sleep|dpkg|sync" | grep -v grep
- update the update script updateBrowser.sh with a command defined as a shell alias:
<browser@browserbox> gbibCoreUpdateBeta
- su to root account
<browser@browserbox> su - # password "browser"
- run an update to get any new required tables downloaded:
<root@browserbox> ./updateBrowser.sh notSelf
- run the update script to get the new CGIs:
<root@browserbox> ./updateBrowser.sh hgwdev galt beta
- NOTE: replace "galt" with your username;
- It logs in as you and rsyncs CGIs directly from remote machines like hgwbeta.
- The system may reboot to apply OS upgrades; after a reboot, run the same updateBrowser.sh command again.
DO NOT FORGET
- IMPORTANT this file needs to be removed manually
- This could be fixed in the updateBrowser.sh script by using an eval on the rsync statements.
<root@browserbox> rm /usr/local/apache/cgi-bin/hg.conf.private
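As a sketch of the fix suggested above (an assumption about how updateBrowser.sh could be changed, not its current behavior), adding an exclude to the rsync would keep hg.conf.private from being copied in the first place. The temporary local directories below merely stand in for the real CGI source and destination:

```shell
# Demonstration of rsync --exclude using temporary local directories that
# stand in for the real CGI source and destination; the actual fix would
# go into the rsync statements in updateBrowser.sh.
src=$(mktemp -d)
dst=$(mktemp -d)
touch "$src/hgTracks" "$src/hg.conf.private"
rsync -a --exclude='hg.conf.private' "$src/" "$dst/"
ls "$dst"    # hgTracks is copied; hg.conf.private is not
```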
- after updateBrowser.sh has run successfully to completion without a reboot, you can now continue with the packaging
- logout from box:
 <root@browserbox> exit
 <browser@browserbox> exit
- (can take 25 minutes for boxRelease.csh)
 <build@hgwdev> time ./boxRelease.csh beta >& logs/v${BRANCHNN}.boxRelease.log
 <build@hgwdev> cp -p /usr/local/apache/htdocs/gbib/gbibBeta.zip /hive/groups/browser/vBox/gbibV${BRANCHNN}.zip
- The script procedure does not function correctly because the updateBrowser.sh script now performs OS updates that require one or more reboots to complete. The buildGbibAndZip.csh script runs the commands as the build user. The steps are archived for posterity but do not work correctly:
- It starts the browserbox vm if it is not already running.
- It updates the box, during which it rsyncs from gbib to hgwdev using a temporary public key.
- It builds a release gbib.zip and also saves a copy labeled with the current version to a backup location.
- time ./buildGbibAndZip.csh >& logs/v${BRANCHNN}.boxRelease.log (takes 20 minutes)
- Examine log for errors:
- less logs/v${BRANCHNN}.boxRelease.log
- If it produces a lot of errors, this is often because the VM updated and rebooted itself, which kills the ssh connection that buildGbibAndZip.csh is trying to use. The script does not do a great job of detecting this problem; however, it often fails within 80 seconds. You can just re-run buildGbibAndZip.csh as above and check that it succeeded. It should take about 20 minutes to run when working normally.
- END OF ARCHIVED SCRIPT PROCEDURE
Update and Restart qateam beta GBiB
As user "build" we have access to $WEEKLYBLD and $BRANCHNN env variables.
ssh -X build@hgwdev # this may have already been done
This script may or may not update the qateam browser box.
Run a script to automatically stop the old browserboxbeta, unregister it, move the old VM to a backup name, unzip a fresh copy of browserbox, rename it to browserboxbeta, re-register it, and restart it.
ssh -X qateam@hgwdev $WEEKLYBLD/updateBrowserboxbeta.csh $BRANCHNN >& logs/v${BRANCHNN}.updateBrowserboxbeta.log
I use the following manual steps:
 ssh qateam@hgwdev
 # To see what may be running:
 VBoxManage list runningvms
   "browserbox" {8e474be8-2808-466a-930d-5e6670ca1cb1}
 # or, view all VMs:
 VBoxManage list vms
   "browserboxalpha" {9442ad82-5672-4ea1-aeab-b40fc9ae691f}
   "browserboxbeta" {8e474be8-2808-466a-930d-5e6670ca1cb1}
 # To stop the betabox:
 VBoxManage controlvm browserboxbeta acpipowerbutton
 # if you get an error about unregister because the box is locked, find the process
 # using the vm and kill it, then proceed with the unregister:
 #   ps -ef | grep boxbeta
 #   run a kill command with the pid from the ps output
 #   run the unregister command again
 # to unregister the betabox:
 VBoxManage unregistervm browserboxbeta
 # reorganize directories
 cd "VirtualBox VMs"
 mv browserboxbeta browserboxbeta.v${BRANCHNN}
 # BRANCHNN does not exist on the qateam account; also remember that this is the
 # OLD BRANCHNN value, as it preserves the previous build's beta box.
 mkdir browserboxbeta
 cd browserboxbeta
 unzip /usr/local/apache/htdocs/gbib/gbibBeta.zip
 # change the ports to use and the VM image name:
 sed -e 's/1234/1236/; s/1235/1237/; s/browserbox/browserboxbeta/;' \
     browserbox.vbox > browserboxbeta.vbox
 # to register this new image:
 cd
 VBoxManage registervm `pwd`/"VirtualBox VMs/browserboxbeta/browserboxbeta.vbox"
 # To start this betabox:
 nice -n +19 VBoxHeadless -s browserboxbeta &
 # to login to the betabox: (wait a few moments for it to get fully started)
 ssh -p 1237 browser@localhost
- test WEB server and login account:
Does it seem to return the index.html ok?
 wget http://localhost:1236/index.html -O /dev/stdout
 htmlCheck getAll http://localhost:1236/index.html
Do you see the correct CGI version?
 ssh -X qateam@hgwdev "wget http://localhost:1236/cgi-bin/hgTracks -O /dev/stdout | grep '<TITLE'"
 htmlCheck getAll http://localhost:1236/cgi-bin/hgTracks | grep '<TITLE'
Can you login to the vm interactively?
 ssh -X qateam@hgwdev
 ssh boxBeta   # use password 'browser' to login - uses .ssh/config to get the port.
 exit          # exit from vm
 exit          # exit from qateam
If you want to be sure that the vm has been correctly updated, has the right version, or confirm that the latest patch is working there, you can browse the browserboxbeta vm, which is on hgwdev on port 1236, via an SSH tunnel from your local machine on port 9991:
Open a new terminal window on your local machine.
Windows: # change to your own username
plink.exe -N -L 127.0.0.1:9991:127.0.0.1:1236 galt@hgwdev.gi.ucsc.edu
Linux:
ssh -N -L 127.0.0.1:9991:127.0.0.1:1236 $USER@hgwdev.gi.ucsc.edu
Open web browser:
http://127.0.0.1:9991/
On your local machine terminal window, press control-c to terminate ssh or plink.
Update kent-core
- update kent-core as yourself, not the build user.
TODO maybe this should just be run at Final Build Wrap-up? TODO make a build service account at github that would be able to support this.
- Make sure your ssh keys are set up; this command should work:
ssh -i .ssh/github -T git@github.com
- If that doesn't work, follow these directions to get your ssh keys setup with your github account: https://docs.github.com/en/authentication/connecting-to-github-with-ssh/generating-a-new-ssh-key-and-adding-it-to-the-ssh-agent
- Note that if you have multiple ssh keys for your hgwdev user (likely), you can tell ssh to use a specific key with a .ssh/config file:
 cat ~/.ssh/config
 Host github.com
     IdentityFile ~/.ssh/github
 chmod 600 ~/.ssh/config
- If you haven't done it yet, clone this Github repo into your ${HOME}/kent-core by doing:
 cd $HOME
 git clone git@github.com:ucscGenomeBrowser/kent-core.git
- update your ~/kent-core clone, then go to the top-level kent (not kent-core) directory of the build tree and run this:
 cd $HOME
 cd kent-core
 git pull
 cd /hive/groups/browser/newBuild/kent
 build/kent-core/makeKentCore
- the command will update ~/kent-core with selected files and then suggest a git commit command for you to update it
- this git command can be added to build/kent-core/makeKentCore in the future; we are keeping this step manual for now so we can test the procedure (Mar 2023).
- It does NOT handle subversions yet, i.e. the case where you have multiple patches and therefore multiple kent-core tags.
Generate the code summaries and review pairings
(Clay takes care of this)
- Assign code-review partners in redmine.
(Lou Takes care of this)
- Summarize the code changes that were committed during the past week. Solicit input from the engineers.
- Update these pages with the summary:
- Send an email to browser-staff with links to the summaries.
Test on hgwdev
- Wait to hear from QA about how their CGIs look on hgwbeta. QA members should update the CGI build chatter ticket in Redmine with a "done testing" message or, if applicable, a "not following issues for this release" message. Each member of the QA team has testing responsibilities.
Make changes to code base as necessary
This happens on days 15, 16, 17, and 18 in the schedule.
- If there are problems with the build a developer will fix the code. This fix needs to be patched into the build on hgwdev. This page explains how to do a Cherry Pick on hgwdev.
Fixing problems in the Build
This usually happens between days 16 and 19.
QA advises buildmeister to cherry pick
- see these instructions.
Push the CGIs
This is day 22 in the schedule.
The day before the push (day 21 in the schedule) send email notice
Send email to all of browser-staff (which includes cluster-admin) letting them know that tomorrow is a push day. Something along these lines:
Just a heads up that tomorrow is a CGI push day. If you have big code changes included in this release please be available in case something goes wrong with the push of your changes. QA typically starts the push around 1:30pm.
Push to hgw0 only
- hgw0 is identical to the RR machines but not actually in the RR (i.e. changes there are not seen by the public).
- QA will send an email to push-request the morning of the push letting the pushers know that today is a CGI push day (this is their notice to be vigilant about pushing quickly).
- QA will ask for push of CGIs from hgwdev to hgw0 only. IF there is a NEW CGI or file(s) going out this week, be sure to make a prominent note of it in your push request. The admins push from a script, and they will need to add your new CGI to the script. (the build-meister should not be cc'd on this email.)
As of April 2019, here's a list of the CGIs and data files we push. Note: CGIs and data files may have been added since this list was created -- this is meant to be a starting point.
cartDump cartReset das hgApi hgBeacon hgBlat hgc hgCollection hgConvert hgCustom hgEncodeApi hgEncodeDataVersions hgEncodeVocab hgFileSearch hgFileUi hgGateway hgGene hgGeneGraph hgGenome hgGtexTrackSettings hgHubConnect hgIntegrator hgLinkIn hgLiftOver hgLogin hgMenubar hgMirror hgNear hgPal hgPcr hgPublicSessions hgRenderTracks hgSession hgSuggest hgTables hgTracks hgTrackUi hgUserSuggestion hgVai hgVisiGene hubApi phyloPng hgPhyloPlace hgPhyloPlaceData
and these configuration files:
/usr/local/apache/cgi-bin/all.joiner /usr/local/apache/cgi-bin/extTools.ra /usr/local/apache/cgi-bin/greatData/* /usr/local/apache/cgi-bin/hgCgiData/* /usr/local/apache/cgi-bin/hgGeneData/* /usr/local/apache/cgi-bin/hgNearData/* /usr/local/apache/cgi-bin/hgcData/* /usr/local/apache/cgi-bin/loader/* /usr/local/apache/cgi-bin/lsSnpPdbChimera.py /usr/local/apache/cgi-bin/visiGeneData/* /usr/local/apache/cgi-bin/pyLib/*
For these directories we request an rsync --delete (from hgwbeta to the RR)
/usr/local/apache/htdocs/js/* /usr/local/apache/htdocs/style/*
- Run TrackCheck on hgw0. This is the responsibility of the QA person who tests hgTracks.
- make a props file which specifies the machine/db to check. Set zoomCount=1 and it will only check the default position for each assembly. Example props file for hgw0:
machine mysqlbeta.soe.ucsc.edu # This is where TrackCheck checks for active databases (active=1). These databases may not be on the RR, and it will give errors which can be ignored.
server hgw0.soe.ucsc.edu # this is the machine that you are testing.
quick false
dbSpec all #You can list just one database here.
table all #You can list one table if need be.
zoomCount 1 # if the number is greater than one, it will check links at higher zoom levels.
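The example settings above can be written to a props file in one step with a here-document; the values are the ones shown (adjust machine, server, and dbSpec as needed):

```shell
# Write the example hgw0 props file described above in one step.
cat > hgw0.props <<'EOF'
machine mysqlbeta.soe.ucsc.edu
server hgw0.soe.ucsc.edu
quick false
dbSpec all
table all
zoomCount 1
EOF
```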
- run it from hgwdev:
 nohup TrackCheck hgw0.props >& $WEEKLYBLD/logs/TrackCheck-hgw0.07-13-2006
- run it in the background if desired by typing Ctrl-Z then "bg"; to check status, type "jobs" or "ps -ef | grep TrackCheck".
- examine the file for errors.
- Monitor the Apache error log (QA does this); see examples here:
hgw0:/usr/local/apache/logs/error_log
To watch the log without line wraps, type "less -S error_log". Typing capital "F" will allow you to follow incoming errors. When errors arise, you can type Ctrl-C and use the right arrow to scroll the window over to see the entire message. *Update*: To view the error log *without* Hiram's CGI_TIME entries (for background info see: http://redmine.soe.ucsc.edu/issues/10081):
$ tail -f error_log | grep -v CGI_TIME
- Wait to hear from QA about how their CGIs look on hgw0. Each member of the QA team has testing responsibilities. Check also that TrackCheck ran successfully.
Push to hgwN only
- hgwN is one of the RR machines, hgw1-6. Each build, rotate to the next machine in numeric order i.e. hgw1 then hgw2 etc. so that one machine is not being worked more than the others.
- Once the new CGIs are on hgwN the push shepherd will watch the error logs for a short while to make sure no new errors occur under load.
Push to the rest of the RR and hgwbeta-public and euronode
- QA will ask for push from hgwbeta to the rest of the hgwN machines, as well as hgwbeta-public and euronode. The js and style directory files should also go to /usr/local/apache/htdocs/js-public/* (or style-public/*) on hgwbeta ONLY in order to keep the javascript the same on the RR and hgwbeta-public. So, in addition to asking for the rsync --delete of the directories from hgwbeta to the RR machines, we also need to ask for an rsync --delete:
 from /usr/local/apache/htdocs/js/*
      /usr/local/apache/htdocs/style/*          (on hgwbeta)
 to   /usr/local/apache/htdocs/js-public/*
      /usr/local/apache/htdocs/style-public/*   (on hgwbeta ONLY)
- QA will send email to the build-meister to let him/her know that the CGIs are on the RR.
Remember to keep track of new features
Anyone can add to this list at any time, but if no notes for this release have been made on the new features page, now is a good time to add some.
Final Build Wrap-up
This is day 23 in the schedule.
The buildmeister should do these steps once QA has notified you that all RR machines have been updated.
Normally this is run once at the end of the cycle. However, occasionally it is necessary to patch a build after it is already released on the RR. Depending upon the extent of the patch, it may be desirable or even necessary to rerun the wrap-up. All of these scripts can be safely rerun until the next build is made (until BRANCHNN is updated in buildEnv.csh).
- Connect as "build" on hgwdev this time. Then go to the weekly build dir on dev:
 hgwdev> ssh -X build@hgwdev   # the optional '-X' allows X-windows support
 <build@hgwdev> cd $WEEKLYBLD
- build and push hgcentral IF there are any changes
 <build@hgwdev> ./buildHgCentralSql.csh >& logs/v${BRANCHNN}.buildHgCentralSql.log
 <build@hgwdev> cat logs/v${BRANCHNN}.buildHgCentralSql.log
 <build@hgwdev> ./buildHgCentralSql.csh real >>& logs/v${BRANCHNN}.buildHgCentralSql.log
 <build@hgwdev> echo $status
- check that the hgcentral.sql has been updated:
http://hgdownload.soe.ucsc.edu/admin/
- build 'userApps' target (various utilities) on hgwdev and scp them to hgdownload
 <build@hgwdev> screen   # if desired
 <build@hgwdev> time ./doHgDownloadUtils.csh >& logs/v${BRANCHNN}.doHgDownloadUtils.log
 <build@hgwdev> echo $status
(takes 45 minutes)
- check the dates on the utils to verify they were updated:
http://hgdownload.soe.ucsc.edu/admin/exe/linux.x86_64 http://hgdownload.soe.ucsc.edu/admin/exe/linux.x86_64/blat/
- update the beta tag to match the release:
 <build@hgwdev> cd $WEEKLYBLD
 <build@hgwdev> env | egrep "BRANCHNN|TODAY|LASTWEEK"   # (just to make sure it looks right)
 <build@hgwdev> ./tagBeta.csh >& logs/v${BRANCHNN}.tagBeta.log
 <build@hgwdev> cat logs/v${BRANCHNN}.tagBeta.log
 <build@hgwdev> ./tagBeta.csh real >>& logs/v${BRANCHNN}.tagBeta.log
 <build@hgwdev> echo $status
- if this fails because something very large was committed during the release, temporarily increase the limit in the kent repo, then finish the steps from tagBeta.csh manually, starting at this step: git push origin beta
- Be sure to restore the limit in kent repo when done.
- tag the official release
 <build@hgwdev> cd $WEEKLYBLD
 <build@hgwdev> git fetch
 <build@hgwdev> git tag | grep "v${BRANCHNN}_branch"
 # Note: if the output is empty, that is fine; it just means none have been
 # created yet, so make branch.1. Otherwise use the next available branch,
 # .2 or whatever is the next unused subversion number.
 <build@hgwdev> git push origin origin/v${BRANCHNN}_branch:refs/tags/v${BRANCHNN}_branch.1
 <build@hgwdev> git fetch
- zip the source code
 <build@hgwdev> cd $WEEKLYBLD
 <build@hgwdev> time ./doZip.csh >& logs/v${BRANCHNN}.doZip.log   # this is automatically pushed to hgdownload
 <build@hgwdev> echo $status
(takes 4 minutes)
- check that the source code .zip files were updated:
http://hgdownload.soe.ucsc.edu/admin/
- WAIT 10 minutes, then run ./userApps.sh, which packages up the userApps/ directory with its source and pushes it to hgdownload htdocs/admin/exe/
time ./userApps.sh >& logs/v${BRANCHNN}.userApps.log
(takes 1 minute)
- check the date on the userApps src.tgz to verify it was updated:
http://hgdownload.soe.ucsc.edu/admin/exe
- The macOSX binaries can now also be constructed
 # on your Mac laptop, you should have a kent source tree on the beta branch
 cd $HOME/kent/src/userApps
 git branch
 # * beta
 #   master
 # verify the git pull brings in the next version, check before and after pull
 grep CGI ../hg/inc/versionInfo.h
 git pull
 grep CGI ../hg/inc/versionInfo.h   # should see the new version number
 # remove the bin directory, update this source, and build new binaries
 rm -fr bin
 make update
 # the constructed index page needs to get to hgwdev:
 scp -p kentUtils.Documentation.txt yourUserName@hgwdev.gi.ucsc.edu:/tmp/
 # one of the binaries always makes an extra unneeded directory
 find ./bin -type d
 # ./bin
 # ./bin/bedToExons.dSYM
 # ./bin/bedToExons.dSYM/Contents
 # ./bin/bedToExons.dSYM/Contents/Resources
 # ./bin/bedToExons.dSYM/Contents/Resources/DWARF
 # do not need the bedToExons.dSYM directory
 rm -fr bin/bedToExons.dSYM
 # the blat binaries need to be in their own directory
 mkdir bin/blatexe
 mv bin/blat bin/blatexe
 mv bin/blatexe bin/blat
 mv bin/gfServer bin/blat
 mv bin/gfClient bin/blat
 # this bin directory can be copied to hgwdev
 cd bin
 rsync -a -P ./ yourUserName@hgwdev.gi.ucsc.edu:/usr/local/apache/htdocs-hgdownload/admin/exe/macOSX.x86_64/
 # check that dynamic libraries make sense. The command on Mac OSX to show
 # dynamic libraries is: otool -L   which can be aliased:
 alias ldd='otool -L'
 # thus:
 ldd ./hgsql
 # /usr/lib/libc++.1.dylib (compatibility version 1.0.0, current version 400.9.4)
 # /usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 1252.250.1)
 # Now, back on hgwdev,
 # verify the change in the index page makes sense
 cd /usr/local/apache/htdocs-hgdownload/admin/exe/macOSX.x86_64
 diff FOOTER.txt /tmp/kentUtils.Documentation.txt
 # extra credit: update README.txt to indicate this release version and note
 # the dynamic libraries and Mac OSX version that was used
 # if that is reasonable, it can replace FOOTER.txt
 cp -p /tmp/kentUtils.Documentation.txt ./FOOTER.txt
 # copy to hgdownload to release these binaries:
 rsync -a -P ./ qateam@hgdownload.soe.ucsc.edu:/mirrordata/apache/htdocs/admin/exe/macOSX.x86_64/
 # take a look there to see if it makes sense
- Push to the genome browser store from hgwdev
 ssh hgwdev   # open a new terminal and login to hgwdev as yourself, not the build user.
 <you@hgwdev> sudo /cluster/bin/scripts/gbib_gbic_push
 <you@hgwdev> exit   # close the terminal window
- Create a Release on github for the release tag. To start, we can do this by visiting https://github.com/ucscGenomeBrowser/kent/releases/new, selecting the vXXX_branch.X tag, and marking that tag as a release on the master branch. Consider linking to https://genecats.gi.ucsc.edu/builds/versions.html in the release description, or else copying the same text you'll use to describe the release notes in the genome-mirror email (below). In the future we can try installing the "gh" command-line tool to interface with github, then run something like "gh release create v467_branch.1".
- WAIT a day for the nightly rsync to happen from the RR for cgi-bin/ and htdocs/ hierarchies to hgdownload
- confirm cgi-bin/ and htdocs/ on hgdownload are up to date,
ftp://hgdownload.soe.ucsc.edu/apache/cgi-bin/ ftp://hgdownload.soe.ucsc.edu/apache/htdocs-rr/
Although web browsers no longer support ftp well, you can still view those ftp directories in Windows File Explorer. Note that when you rsync from hgdownload, htdocs/ is mapped to htdocs-rr/ internally.
- send email to genome-mirror@soe.ucsc.edu.
- Refer to this form when making a summary:
https://genecats.gi.ucsc.edu/builds/versions.html
Include this link to latest source: http://hgdownload.soe.ucsc.edu/admin/jksrc.zip. Use the last email as a template (see https://www.soe.ucsc.edu/pipermail/genome-mirror). If you push the hgcentral.sql, make sure to mention this has also changed in the email. If you're having trouble formatting the CGI name list for 80 columns, the UNIX fold -s command should help.
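For example, fold -s wraps a long, space-separated CGI list at word boundaries so no line in the email exceeds 80 columns (the names below are a short excerpt from the push list above):

```shell
# Wrap a long, space-separated CGI list to 80 columns for the email.
# The names here are an excerpt from the push list in this document.
wrapped=$(printf '%s ' cartDump cartReset das hgApi hgBeacon hgBlat hgc \
    hgCollection hgConvert hgCustom hgGateway hgGene hgTables hgTracks \
    hgTrackUi hgVai | fold -s -w 80)
echo "$wrapped"
```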
Example:
 To: genome-mirror@soe.ucsc.edu
 Subject: v457 Genome Browser Available

 Good Afternoon Genome Browser Mirror Site Operators:

 Version v457 of the UCSC Genome Browser software has now been released.
 The main changes are:

 Added (i) icons to hgLiftOver, to explain the various obscure options without
 so much text on the page. Note that this uses the new printInfoIcon(text)
 function, which we can use on many CGIs.
 Try again: users can now query old NCBI transcripts, both by name and in hgvs
 coordinates, for hg38 only.
 Fixed problems with RTS load where composite and supertrack children were
 stuck in positions set by previous RTS loads.
 hgSearch now allows clicking the text to show/hide results instead of just
 the small button.
 Made more improvements to tooltips: left click or contextmenu click closes
 tooltip, fix tooltip y positioning when it would normally be above the
 viewport, left justify the tooltip text.
 Documented "accession1;accession2" query in the position box.
 Snp hgc pages now use mitochondrial codon table.
 Fixed never-used multi-term searches that each resolves to a singlePos
 structure to show a range with each item highlighted.
 Fixed problems with forcing bigBed filtering to pass items found with hgFind.
 Staged work in progress hgc code for creating a table of links to associated
 chains for the chainHRPC tracks.
 Made various changes for Chinese internet rules.
 Now using forceTwoBit by default, no more .nib files are needed.
 Fixed OMIM tracks error on GBIB and GBIC, no redmine, various emails.
 gnomAD v4 VCF track for hg38.

 For a comprehensive list of changes for this version, please visit/bookmark
 (you may need to force refresh in your web browser to see an updated page):
 https://genecats.gi.ucsc.edu/builds/versions.html

 We typically release a new version every three weeks. A summary of all past
 releases and changes can be found at the above link.
 The new source code tree is available at:
   http://hgdownload.soe.ucsc.edu/admin/jksrc.zip
 or labeled with version number at:
   http://hgdownload.soe.ucsc.edu/admin/jksrc.v457.zip
 or in our Github repository in the "beta" branch, also tagged with the
 version number.

 If you use the GBIB virtual machine with auto-updates enabled, it will
 automatically update itself on Sunday.

 If you have installed your UCSC Genome Browser server with the GBIC
 installation script browserSetup.sh, then you can upgrade it with the
 command "sudo bash browserSetup.sh update".

 If you have installed your Genome Browser manually:

 - You can use the following rsync command to copy the CGI binaries into the
   cgi-bin directory of your Apache server:
   rsync -avP rsync://hgdownload.soe.ucsc.edu/cgi-bin/ ${WEBROOT}/cgi-bin/

 - We provide a shell script to update the htdocs directory; htdocs and
   cgi-bin must be updated together:
   https://github.com/ucscGenomeBrowser/kent/blob/master/src/product/scripts/updateHtml.sh

 - The hgcentral database contains our curated list of public track hubs and
   pointers to BLAT servers per database and should be updated regularly. We
   have released a new MySQL dump of this database:
   http://hgdownload.soe.ucsc.edu/admin/hgcentral.sql

 A license is required for commercial download and/or installation of the
 Genome Browser binaries and source code. No license is needed for academic,
 nonprofit, and personal use. More information on our licensing page:
 http://genome.ucsc.edu/license/

 If you have any questions or concerns, please feel free to write back to
 this genome-mirror mail list.

 Galt Barber
 UCSC Genomics Institute