Added new ANT build files

This commit is contained in:
CekisSWG
2018-12-28 02:36:09 +00:00
parent d66e44c6b9
commit ce657b4320
6 changed files with 1048 additions and 91 deletions

1
.gitignore vendored

@@ -7,6 +7,7 @@ dsrc/
dependencies/
exe/
src/
miff/
!stationapi/src/
stationapi/build/
stationapi/externals/

150
README.md Normal file → Executable file

@@ -1,7 +1,8 @@
# SWGSource V1.2 Build Instructions
## Credit
Credit the StellaBellum team (DarthArgus, Cekis, SilentGuardian, Ness, SwgNoobs, DevCodex) for making their repositories open. All source is forked from those repositories and progressed from that point.
## What Do You Need To Do To Get A Server Running?
@@ -16,91 +17,46 @@ When the GIT repository has been cloned successfully, open the swg-main director
cd /home/swg/swg-main
Create a .setup file such that the build script knows where to start:
touch .setup
## Building
### Requirements
- Java 8 (1.8.0_101)
- Apache ANT 1.9+
First and foremost, you'll need to install Apache ANT (at least version 1.9) on your VM. ANT is required for the build process to run successfully. ANT
will be included in the next VM build, but for now here are the steps to install it manually:
1. Go to https://ant.apache.org and download the latest version of ANT (1.10 something as of this writing).
2. Expand the ANT package (.zip or .tar) into your VM (or server) directories somewhere. Take note of the location where you expanded it.
3. Edit your .profile and add a line that sets the location where you expanded it as ANT_HOME.
4. While you're editing your profile, make sure that JAVA_HOME is set to the right spot too. You can figure out where Java is installed by using the
"which" command: ```which java```
5. Save your .profile edits and verify that ANT is installed properly by typing "ant" at any location on your command line. You should get an
error message complaining that build.xml is missing; that error is expected. If JAVA_HOME isn't set correctly, you'll get an error about that too.
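For example, the .profile additions might look like the following (the install paths below are assumptions; point ANT_HOME at wherever you expanded ANT, and JAVA_HOME at wherever `which java` resolves on your system):

```shell
# Hypothetical install locations -- adjust to your system
export ANT_HOME=/opt/apache-ant-1.10.5
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64

# Make the ant and java binaries available everywhere on the command line
export PATH="$PATH:$ANT_HOME/bin:$JAVA_HOME/bin"
```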
### Starting the Build Process
The binary building phase will take roughly 1 hour for each CPU core assigned to your VM. Do not skip any steps.
To complete the build, kick it off from your swg-main directory by typing in: ```ant swg```
The build process is fully configured in the build.properties file. There is no need to touch this file unless you have a fully customized version that you
would like to run. For starters, just don't worry about touching it.
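If you do need a one-off tweak without editing build.properties, ANT lets you override any property from the command line with -D; the property names below come from this commit's build.properties, while the values are hypothetical examples:

```shell
# Command-line -D overrides take precedence over build.properties for this run only
ant -Dcluster_name=TestGalaxy -Dsrc_build_type=Release swg
```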
You can also run sections of the build manually (not recommended until you are used to the environment you're working in).
A very useful command is ```ant git_all_update```. It will attempt to clone the pre-configured repositories that are outlined in the build file, or
simply do a pull to bring them up to date if they already exist. This currently includes "src", "dsrc", and others:
```
ant git_src
ant git_dsrc
ant git_configs
```
DID YOU KNOW? You can specify multiple ANT targets with a single command (this goes for any of the targets you give it). ANT will execute them in the
order you provide and make sure all of their dependencies are met:
```
ant git_src git_dsrc
```
#### Final configuration:
@@ -115,32 +71,44 @@ Point your login.cfg in your game folder (on your client machine, NOT the VM) to
AND YOU'RE DONE!
# MORE READING...
## build_dsrc.sh
The following targets ARE NOT REQUIRED, but you may find yourself wanting to build certain parts of the dsrc individually instead of all at once as done in the
ANT build script. Here is some information about the specific targets in the ANT build file:
#### Compiling the mIFF files
The "compile_miff" target will compile all *.mif files into *.iff binary files.
```ant compile_miff```
#### Compiling the Datatable files
The "compile_tab" target will compile all the *.tab files into *.iff binary files:
```ant compile_tab```
#### Compiling Template Files
The "compile_tpf" target will compile all the *.tpf files into *.iff binary files:
```ant compile_tpf```
If you have built the TPF files in this step (i.e. you didn't skip this step) then the target will also attempt to recreate the Object Template and Quest CRC
tables and subsequently will attempt to push those changes to the database since this will also be required. A GREAT feature to have when creating new template files
or changing existing ones.
Again, if you wish to do several of these things, you can string multiple targets together like so (not all three are required, and they can be given in any
order since ANT already handles the dependencies):
```ant compile_miff compile_tab compile_tpf```
This particular command will first build the MIFF files, then compile the TAB files, then compile and load the Template Files into the database.
### Database Phase
#### Building Object Template and Quest CRC Files
This step will compile the object template and quest CRC files. These files translate the long name of these files (including file path) into a very short code that
allows the server to identify them without the danger of long text being transferred over the internet in packets. Basically an optimization that SOE implemented:
```ant load_templates```
Building these files will also trigger the target to populate the database with the CRCs that were generated. If you are doing this in pieces
(i.e. you're selectively building), this is a GREAT way to re-import new or changed TPF files. In order to re-import CRCs into the database, it will ask you for
the database information here if you haven't already entered it above.

25
build.properties Executable file

@@ -0,0 +1,25 @@
# General
# Controls whether the database will be dropped. Set this to true if you want to drop the db, then run "ant drop_database" from the swg-main dir.
firstrun = false
# Server Settings
server_ip_address = 192.168.0.95
cluster_name = Korriban
# Database Settings
db_username = swg
db_password = swg
db_service = swg
# Git Settings
src_repo = https://github.com/SWG-Source/src.git
dsrc_repo = https://github.com/SWG-Source/dsrc.git
configs_repo = https://github.com/SWG-Source/configs.git
clientdata_repo = https://github.com/SWG-Source/clientdata.git
src_branch = master
dsrc_branch = master
clientdata_branch = master
configs_branch = master
# SRC Compilation
src_build_type = Debug
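Note that build.xml derives the Oracle connect string it hands to sqlplus from these values (`service_name = //${server_ip_address}/${db_service}`). The same derivation sketched in shell, using the values above:

```shell
# Values copied from build.properties
server_ip_address=192.168.0.95
db_service=swg

# build.xml concatenates these into its service_name property
service_name="//${server_ip_address}/${db_service}"
echo "$service_name"   # prints //192.168.0.95/swg
```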

418
build.xml Executable file

@@ -0,0 +1,418 @@
<project name="SWGBuild" default="init" basedir="/home/swg/swg-main">
<import file="git_targets.xml" />
<description>
This build file will build all aspects of the SWG Source Code. Created by Cekis (cekisswg@gmail.com).
</description>
<!-- Property File -->
<property file = "build.properties"/>
<!-- Global Properties -->
<property name="build" location="${basedir}/build"/>
<!-- Database Service Name is derived to make it easier in the properties file -->
<property name="service_name" value="//${server_ip_address}/${db_service}"/>
<!-- Setup Source Directories -->
<property name="exe" location="${basedir}/exe"/>
<property name="src" location="${basedir}/src"/>
<property name="dsrc" location="dsrc" relative="true" basedir="${basedir}"/>
<property name="data" location="data" relative="true" basedir="${basedir}"/>
<property name="clientdata" location="${basedir}/clientdata"/>
<property name="configs" location="configs" relative="true" basedir="${basedir}"/>
<!-- Setup Key Game Directories -->
<property name="dsrc_server" location="${dsrc}/sku.0/sys.server/compiled/game"/>
<property name="dsrc_shared" location="${dsrc}/sku.0/sys.shared/compiled/game"/>
<property name="data_server" location="${data}/sku.0/sys.server/compiled/game"/>
<property name="data_shared" location="${data}/sku.0/sys.shared/compiled/game"/>
<property name="data_client" location="${data}/sku.0/sys.client/compiled"/>
<!-- Setup CRC Files to load into the database -->
<property name="object_crc_file" location="${dsrc}/sku.0/sys.server/built/game/misc/object_template_crc_string_table.tab"/>
<property name="templates_sql_file" location="${build}/templates.sql"/>
<!-- Define where most of our compiled tools will live -->
<property name="tools_home" location="${build}/bin"/>
<property name="bin_home" location="${exe}/linux/bin"/>
<property environment="env"/>
<!-- The init target handles the environment setup - not much to do but create directories -->
<target name="init">
<tstamp/>
<mkdir dir="${build}"/>
<mkdir dir="${data_server}"/>
<mkdir dir="${data_shared}"/>
<mkdir dir="${data_client}"/>
</target>
<target name="swg" description="builds the entire SWG codebase for the first run" depends="clean,git_all_update,update_configs,create_database,compile">
</target>
<!-- Clean simply calls the other clean targets -->
<target name="clean" depends="clean_src,clean_dsrc,init">
</target>
<!-- Delete the SRC Build folder -->
<target name="clean_src">
<echo>Cleaning the SRC build directory.</echo>
<delete dir="${build}" verbose="false"/>
</target>
<!-- Delete the DSRC Build folder -->
<target name="clean_dsrc">
<echo>Cleaning the DSRC directory.</echo>
<delete dir="${data}" verbose="false"/>
</target>
<!-- Delete the compiled Java script folder -->
<target name="clean_java">
<echo>Cleaning the DSRC script directory.</echo>
<delete dir="${data_server}/script" verbose="false"/>
</target>
<!-- Gets the architecture we're on - uses old way of getting it from original build_linux.sh script -->
<target name="get_arch">
<exec executable="arch" dir="." outputproperty="arch"/>
<condition property="compile.x86">
<equals arg1="${arch}" arg2="x86_64"/>
</condition>
<echo>Architecture is ${arch}</echo>
<condition property="is_debug_build">
<equals arg1="${src_build_type}" arg2="Debug"/>
</condition>
<echo>Creating a ${src_build_type} build</echo>
</target>
<!-- Gets the number of processors at our disposal -->
<target name="get_num_procs">
<exec executable="nproc" dir="." outputproperty="nproc"/>
<echo>We have ${nproc} processors (cores) to use.</echo>
</target>
<!-- Calls git on all of our source folders -->
<target name="git_all_update" description="go refresh all of our repos" depends="check_git_dirs,clone_src,git_src,clone_dsrc,git_dsrc,clone_clientdata,git_clientdata,clone_configs,git_configs">
</target>
<target name="check_git_dirs" description="checks and sets values if directories exist or not">
<condition property="src.exists">
<available file="${src}" type="dir"/>
</condition>
<condition property="dsrc.exists">
<available file="${dsrc}" type="dir"/>
</condition>
<condition property="clientdata.exists">
<available file="${clientdata}" type="dir"/>
</condition>
<condition property="configs.exists">
<available file="${configs}" type="dir"/>
</condition>
</target>
<target name="update_configs" description="updates the configuration files with the desired settings" if="firstrun" depends="clone_configs">
<replace dir="${exe}" propertyFile="${basedir}/build.properties">
<include name="**/*.cfg"/>
<replacefilter token="CLUSTERNAME" property="cluster_name"/>
<replacefilter token="HOSTIP" property="server_ip_address"/>
<replacefilter token="DBUSERNAME" property="db_username"/>
<replacefilter token="DBSERVICE" value="//${server_ip_address}/${db_service}"/>
<replacefilter token="DBPASSWORD" property="db_password"/>
</replace>
</target>
<!-- Calls git on our SRC folder -->
<target name="clone_src" description="go get the source from swg source" unless="${src.exists}" depends="check_git_dirs">
<git-clone-pull repository="${src_repo}" dest="${src}" branch="${src_branch}"/>
</target>
<target name="git_src" description="go get the source from swg source" if="${src.exists}" depends="check_git_dirs">
<git-pull dir="${src}" branch="${src_branch}"/>
</target>
<!-- Calls git on our DSRC folder -->
<target name="clone_dsrc" description="go get server dsrc from swg source" unless="${dsrc.exists}" depends="check_git_dirs">
<git-clone-pull repository="${dsrc_repo}" dest="${dsrc}" branch="${dsrc_branch}"/>
</target>
<target name="git_dsrc" description="go get server dsrc from swg source" if="${dsrc.exists}" depends="check_git_dirs">
<git-pull dir="${dsrc}" branch="${dsrc_branch}"/>
</target>
<!-- Calls git on our CLIENTDATA folder -->
<target name="clone_clientdata" description="go get clientdata from swg source" unless="${clientdata.exists}" depends="check_git_dirs">
<git-clone-pull repository="${clientdata_repo}" dest="${clientdata}" branch="${clientdata_branch}"/>
<mkdir dir="${exe}/linux/logs"/>
</target>
<target name="git_clientdata" description="go get clientdata from swg source" if="${clientdata.exists}" depends="check_git_dirs">
<git-pull dir="${clientdata}" branch="${clientdata_branch}"/>
</target>
<!-- Calls git on our CONFIGS folder -->
<target name="clone_configs" description="go get server configs from swg source" unless="${configs.exists}" depends="check_git_dirs">
<git-clone-pull repository="${configs_repo}" dest="${configs}" branch="${configs_branch}"/>
<copy todir="${basedir}/exe">
<fileset dir="${configs}"/>
</copy>
</target>
<target name="git_configs" description="go get server config files from swg source" if="${configs.exists}" depends="check_git_dirs">
<git-pull dir="${configs}" branch="${configs_branch}"/>
</target>
<!-- Creates the Make files for our SRC that will be used during compile stage (Intel) -->
<target name="prepare_src_x86" depends="init,get_arch" description="prepare server code - Intel" if="compile.x86">
<exec executable="cmake" dir="${build}" failonerror="true">
<env key="PATH" value="${env.PATH}:${tools_home}"/>
<env key="CC" value="clang"/>
<env key="CXX" value="clang++"/>
<env key="LDFLAGS" value="-L/usr/lib32"/>
<env key="CMAKE_PREFIX_PATH" value="/usr/lib32:/lib32:/usr/lib/i386-linux-gnu:/usr/include/i386-linux-gnu"/>
<arg value="-DCMAKE_C_FLAGS=-m32"/>
<arg value="-DCMAKE_CXX_FLAGS=-m32"/>
<arg value="-DCMAKE_EXE_LINKER_FLAGS=-m32"/>
<arg value="-DCMAKE_MODULE_LINKER_FLAGS=-m32"/>
<arg value="-DCMAKE_SHARED_LINKER_FLAGS=-m32"/>
<arg value="-DCMAKE_BUILD_TYPE=${src_build_type}"/>
<arg value="${src}"/>
</exec>
</target>
<!-- Creates the Make files for our SRC that will be used during compile stage (Non-Intel) -->
<target name="prepare_src" depends="init,get_arch" description="compile server code - non Intel" unless="compile.x86">
<exec executable="cmake" dir="${build}" failonerror="true">
<env key="PATH" value="${env.PATH}:${tools_home}"/>
<env key="CC" value="clang"/>
<env key="CXX" value="clang++"/>
<arg value="-DCMAKE_BUILD_TYPE=${src_build_type}"/>
<arg value="${src}"/>
</exec>
</target>
<target name="strip_src" unless="${is_debug_build}" description="removes debugging information from release builds, making them smaller">
<exec executable="strip" dir="${build}">
<arg value="-d"/>
<arg value="bin/*"/>
</exec>
</target>
<!-- Compiles the SRC (C++) code -->
<target name="compile_src" description="compile server code" depends="init,prepare_src,prepare_src_x86,get_num_procs,strip_src">
<exec executable="make" dir="${build}" failonerror="true">
<arg value="-j${nproc}"/>
</exec>
</target>
<!-- Compiles the DSRC (Java) code -->
<target name="compile_java" depends="init" description="compile java code">
<javac srcdir="${dsrc_server}" destdir="${data_server}" includeantruntime="false" classpath="${data_server}" encoding="utf8" sourcepath="${dsrc_server}" debug="true" deprecation="on">
<compilerarg value="-Xlint:-options"/>
</javac>
<antcall target="create_symlinks"/>
</target>
<!-- Compiles all code necessary for server execution -->
<target name="compile" depends="compile_src,compile_java,compile_miff,compile_tpf,compile_tab,load_templates">
</target>
<!-- Compiles all .mif files -->
<target name="compile_miff">
<fileset id="miff_files" dir="${dsrc}" includes="**/*.mif"/>
<touch mkdirs="true" verbose="false">
<fileset refid="miff_files"/>
<mapper type="glob" from="*.mif" to="${data}/*/.tmp" />
</touch>
<delete>
<fileset dir="${data}" includes="**/.tmp"/>
</delete>
<apply executable="./Miff" dir="${tools_home}" dest="${data}" parallel="false" type="file">
<env key="PATH" value="${env.PATH}:${tools_home}"/>
<arg value="-i"/>
<srcfile prefix="&quot;" suffix="&quot;"/>
<arg value="-o"/>
<targetfile prefix="&quot;" suffix="&quot;"/>
<fileset refid="miff_files"/>
<mapper type="glob" from="*.mif" to="*.iff"/>
</apply>
<antcall target="cleanup"/>
</target>
<!-- Compiles all .tab files -->
<target name="compile_tab">
<property name="server_datatables" location="${dsrc_server}/datatables"/>
<property name="shared_datatables" location="${dsrc_shared}/datatables"/>
<property name="include_datatables" location="${shared_datatables}/include"/>
<touch mkdirs="true" verbose="false">
<fileset dir="${dsrc}" includes="**/*.tab"/>
<mapper type="glob" from="*.tab" to="${data}/*/.tmp" />
</touch>
<delete>
<fileset dir="${data}" includes="**/.tmp"/>
</delete>
<apply executable="./DataTableTool" dir="${tools_home}" dest="${data}" parallel="false" type="file" failonerror="true">
<env key="PATH" value="${env.PATH}:${tools_home}"/>
<arg value="-i"/>
<srcfile prefix="&quot;" suffix="&quot;"/>
<arg value="-- -s SharedFile"/>
<arg value="searchPath10=${data_shared}"/>
<arg value="searchPath10=${data_server}"/>
<fileset dir="${dsrc}" includes="**/*.tab" excludes="**/object_template_crc_string_table.tab,**/quest_crc_string_table.tab"/>
<mapper type="glob" from="*.tab" to="*.iff"/>
</apply>
<antcall target="cleanup"/>
</target>
<!-- Compiles all Template Files (.tpf) -->
<target name="compile_tpf" description="compile the template files (*.tpf) into .iff">
<touch mkdirs="true" verbose="false">
<fileset dir="${dsrc}" includes="**/*.tpf"/>
<mapper type="glob" from="*.tpf" to="${data}/*/.tmp" />
</touch>
<delete>
<fileset dir="${data}" includes="**/.tmp"/>
</delete>
<apply executable="${tools_home}/TemplateCompiler" dir="${basedir}" dest="${basedir}" parallel="false" type="file" failonerror="false" relative="true">
<env key="PATH" value="${env.PATH}:${tools_home}"/>
<arg value="-compile"/>
<srcfile/>
<fileset dir="${basedir}" includes="${dsrc}/**/*.tpf"/>
<mapper type="glob" from="${dsrc}/*.tpf" to="${data}/*.iff"/>
</apply>
<antcall target="load_templates"/>
<antcall target="cleanup"/>
</target>
<!-- Creates the Object Template CRC file -->
<target name="build_object_template_crc" description="creates the object template crc file">
<exec executable="utils/build_object_template_crc_string_tables.py" dir="${basedir}">
<env key="PATH" value="${env.PATH}:${tools_home}"/>
</exec>
</target>
<!-- Creates the Quest CRC file -->
<target name="build_quest_crc" description="creates the quest crc file">
<exec executable="utils/build_quest_crc_string_tables.py" dir="${basedir}">
<env key="PATH" value="${env.PATH}:${tools_home}"/>
</exec>
</target>
<!-- Creates SQL (insert statements) to get all the CRC Templates into the database -->
<target name="process_templates" description="generates sql from generated crc files" depends="build_object_template_crc,build_quest_crc">
<exec executable="perl" dir="${basedir}/src/game/server/database/templates" input="${object_crc_file}" output="${templates_sql_file}">
<env key="PATH" value="${env.PATH}:${tools_home}"/>
<arg value="processTemplateList.pl"/>
</exec>
</target>
<!-- Executes the generated Template CRC SQL in SQL*Plus -->
<target name="load_templates" description="loads generated templates into the database" depends="process_templates">
<exec executable="sqlplus" dir="${build}">
<arg value="${db_username}/${db_password}@${service_name}"/>
<arg value="@${templates_sql_file}"/>
</exec>
</target>
<!-- Target used to create database tables -->
<target name="create_database" description="creates database tables from existing sql scripts" if="firstrun">
<replace file="build.properties" token="firstrun = true" value="firstrun = false"/>
<exec executable="perl" dir="${basedir}/src/game/server/database/build/linux">
<env key="PATH" value="${env.PATH}:${tools_home}"/>
<arg value="database_update.pl"/>
<arg value="--username=${db_username}"/>
<arg value="--password=${db_password}"/>
<arg value="--service=${service_name}"/>
<arg value="--goldusername=${db_username}"/>
<arg value="--loginusername=${db_username}"/>
<arg value="--createnewcluster"/>
<arg value="--packages"/>
</exec>
</target>
<!-- Target used to delete database tables - change properties file "firstrun" from "false" to "true" to enable execution -->
<target name="drop_database" description="drops database tables using existing sql scripts" if="firstrun">
<replace file="build.properties" token="firstrun = true" value="firstrun = false"/>
<exec executable="perl" dir="${basedir}/src/game/server/database/build/linux">
<env key="PATH" value="${env.PATH}:${tools_home}"/>
<arg value="database_update.pl"/>
<arg value="--username=${db_username}"/>
<arg value="--password=${db_password}"/>
<arg value="--service=${service_name}"/>
<arg value="--goldusername=${db_username}"/>
<arg value="--loginusername=${db_username}"/>
<arg value="--drop"/>
<arg value="--packages"/>
</exec>
</target>
<target name="create_symlinks" if="firstrun">
<symlink link="${basedir}/data/sku.0/sys.client/compiled/clientdata" resource="${clientdata}"/>
<symlink link="${basedir}/exe/linux/bin" resource="${tools_home}"/>
</target>
<target name="start" description="starts the server" depends="create_symlinks,stop">
<exec executable="bin/LoginServer" dir="${exe}/linux">
<arg value="--"/>
<arg value="@servercommon.cfg &amp;"/>
</exec>
<exec executable="sleep" dir="${exe}/linux">
<arg value="5"/>
</exec>
<exec executable="bin/TaskManager" dir="${exe}/linux">
<arg value="--"/>
<arg value="@servercommon.cfg"/>
</exec>
</target>
<target name="stop" description="stops the login server">
<exec executable="killall" dir="${basedir}">
<arg value="LoginServer"/>
</exec>
<exec executable="killall" dir="${basedir}">
<arg value="CentralServer"/>
</exec>
<exec executable="killall" dir="${basedir}">
<arg value="ChatServer"/>
</exec>
<exec executable="killall" dir="${basedir}">
<arg value="CommoditiesServer"/>
</exec>
<exec executable="killall" dir="${basedir}">
<arg value="ConnectionServer"/>
</exec>
<exec executable="killall" dir="${basedir}">
<arg value="CustomerServiceServer"/>
</exec>
<exec executable="killall" dir="${basedir}">
<arg value="LogServer"/>
</exec>
<exec executable="killall" dir="${basedir}">
<arg value="MetricsServer"/>
</exec>
<exec executable="killall" dir="${basedir}">
<arg value="PlanetServer"/>
</exec>
<exec executable="killall" dir="${basedir}">
<arg value="ServerConsole"/>
</exec>
<exec executable="killall" dir="${basedir}">
<arg value="SwgDatabaseServer"/>
</exec>
<exec executable="killall" dir="${basedir}">
<arg value="SwgGameServer"/>
</exec>
<exec executable="killall" dir="${basedir}">
<arg value="TransferServer"/>
</exec>
<exec executable="killall" dir="${basedir}">
<arg value="TaskManager"/>
</exec>
</target>
<!-- Cleans up empty folders from the build folder -->
<target name="cleanup" description="Clean up">
<delete includeemptydirs="true">
<fileset dir="${data}">
<and>
<size value="0"/>
<type type="dir"/>
</and>
</fileset>
</delete>
</target>
</project>

362
build_linux.sh Executable file

@@ -0,0 +1,362 @@
#!/bin/bash
basedir=$PWD
PATH=$PATH:$basedir/build/bin
DBSERVICE=
DBUSERNAME=
DBPASSWORD=
HOSTIP=
CLUSTERNAME=
NODEID=
DSRC_DIR=
DATA_DIR=
GIT_URL="https://bitbucket.org/stellabellumswg/"
GIT_REPO_SRC=${GIT_URL}src.git
GIT_REPO_DSRC=${GIT_URL}dsrc.git
GIT_REPO_CONFIG=${GIT_URL}configs.git
GIT_REPO_DATA=${GIT_URL}data.git
# Debug and Release are for testing and not public servers, they lack optimizations
#MODE=Release
MODE=Debug
# Public/high load release builds - heavily optimized bins
#MODE=MINSIZEREL
# Builds binaries to generate profdata
#MODE=RELWITHDEBINFO
if [ ! -d $basedir/build ]
then
mkdir $basedir/build
fi
if [ ! -f $basedir/.setup ]; then
if [[ $(lsb_release -a) =~ .*Ubuntu.* ]] || [ -f "/etc/debian_version" ]
then
read -p "!!!ONLY RUN ONCE!!! Do you want to install dependencies (y/n)?" response
response=${response,,} # tolower
if [[ $response =~ ^(yes|y| ) ]]; then
$basedir/utils/init/debian.sh
source /etc/profile.d/java.sh
source /etc/profile.d/oracle.sh
touch $basedir/.setup
echo "Please login and out or reboot as changes have been made to your PATH "
fi
fi
fi
read -p "Do you want to pull/update git? (y/n) " response
response=${response,,} # tolower
if [[ $response =~ ^(yes|y| ) ]]; then
# update main repo
git pull
# update or clone each sub-repo
if [ ! -d $basedir/src ]; then
git clone $GIT_REPO_SRC src
else
cd $basedir/src
git pull
cd $basedir
fi
if [ ! -d $basedir/dsrc ]; then
git clone $GIT_REPO_DSRC dsrc
else
cd $basedir/dsrc
git pull
cd $basedir
fi
if [ ! -d $basedir/configs ]; then
git clone $GIT_REPO_CONFIG configs
else
cd $basedir/configs
git pull
cd $basedir
fi
fi
read -p "Do you want to recompile the server code (C++) now? (y/n) " response
response=${response,,} # tolower
if [[ $response =~ ^(yes|y| ) ]]; then
cd $basedir/build
# prefer clang
if type clang &> /dev/null; then
export CC=clang
export CXX=clang++
fi
if [ $(arch) == "x86_64" ]; then
export LDFLAGS=-L/usr/lib32
export CMAKE_PREFIX_PATH="/usr/lib32:/lib32:/usr/lib/i386-linux-gnu:/usr/include/i386-linux-gnu"
cmake -DCMAKE_C_FLAGS=-m32 \
-DCMAKE_CXX_FLAGS=-m32 \
-DCMAKE_EXE_LINKER_FLAGS=-m32 \
-DCMAKE_MODULE_LINKER_FLAGS=-m32 \
-DCMAKE_SHARED_LINKER_FLAGS=-m32 \
-DCMAKE_BUILD_TYPE=$MODE \
$basedir/src
else
cmake $basedir/src -DCMAKE_BUILD_TYPE=$MODE
fi
make -j$(nproc)
cd $basedir
# Miff isn't playing nice with clang in release mode
if [ "$CC" = "clang" ]; then
if [ ! -d "miff" ]; then
mkdir miff
fi
cd miff
if [ $(arch) == "x86_64" ]; then
export LDFLAGS=-L/usr/lib32
export CMAKE_PREFIX_PATH="/usr/lib32:/lib32:/usr/lib/i386-linux-gnu:/usr/include/i386-linux-gnu"
cmake -DCMAKE_C_FLAGS=-m32 -D_MIFF=1 \
-DCMAKE_CXX_FLAGS=-m32 \
-DCMAKE_EXE_LINKER_FLAGS=-m32 \
-DCMAKE_MODULE_LINKER_FLAGS=-m32 \
-DCMAKE_SHARED_LINKER_FLAGS=-m32 \
-DCMAKE_BUILD_TYPE=Debug \
$basedir/src
else
cmake $basedir/src -DCMAKE_BUILD_TYPE=Debug -D_MIFF=1
fi
cd engine/client/application/Miff
make -j$(nproc)
cd $basedir
mv miff/bin/Miff build/bin/
fi;
if [ $MODE = "Release" ] || [ $MODE = "MINSIZEREL" ]; then
strip -s $basedir/build/bin/*
fi
cd $basedir;
fi
read -p "Do you want to build the config environment now? (y/n) " response
response=${response,,} # tolower
if [[ $response =~ ^(yes|y| ) ]]; then
# Prompt for configuration environment.
read -p "Which configuration environment (local, live, tc, design)? You probably want local. " config_env
# Make sure the configuration environment exists.
if [ ! -d $basedir/configs/$config_env ]; then
echo "Invalid configuration environment."
exit 1
fi
echo "Enter your IP address (LAN for port forwarding or internal, outside IP for DMZ)"
read HOSTIP
echo "Enter the DSN for the database connection "
read DBSERVICE
echo "Enter the database username "
read DBUSERNAME
echo "Enter the database password "
read DBPASSWORD
echo "Enter a name for the galaxy cluster "
read CLUSTERNAME
if [ -d $basedir/exe ]; then
rm -rf $basedir/exe
fi
mkdir -p $basedir/exe/linux/logs
mkdir -p $basedir/exe/shared
ln -s $basedir/build/bin $basedir/exe/linux/bin
cp -u $basedir/configs/$config_env/linux/* $basedir/exe/linux
cp -u $basedir/configs/$config_env/shared/* $basedir/exe/shared
for filename in $(find $basedir/exe -name '*.cfg'); do
sed -i -e "s@DBSERVICE@$DBSERVICE@g" -e "s@DBUSERNAME@$DBUSERNAME@g" -e "s@DBPASSWORD@$DBPASSWORD@g" -e "s@CLUSTERNAME@$CLUSTERNAME@g" -e "s@HOSTIP@$HOSTIP@g" $filename
done
#
# Generate other config files if their template exists.
#
# Generate at least 1 node that is the /etc/hosts IP.
$basedir/utils/build_node_list.sh
fi
if [ ! -d $basedir/data ]; then
read -p "Symlink to data directory (y/n)? " remote_dsrc_response
remote_dsrc_response=${remote_dsrc_response,,}
if [[ $remote_dsrc_response =~ ^(yes|y| ) ]]; then
read -p "Enter target data directory " DATA_DIR
ln -s $DATA_DIR ./data
fi
fi
read -p "Do you want to recompile the scripts (.java)? (y/n) " response
response=${response,,} # tolower
if [[ $response =~ ^(yes|y| ) ]]; then
#prepare environment to run data file builders
oldPATH=$PATH
PATH=$basedir/build/bin:$PATH
read -p "Do you want to use multi-core building (default) or the safer single-core option? You may need to rerun the single-core version if there are stragglers. (multi/safe) " response
response=${response,,}
if [[ $response =~ ^(multi|m| ) ]]; then
$basedir/utils/build_java_multi.sh
else
$basedir/utils/build_java.sh
fi
PATH=$oldPATH
fi
buildTemplates=false
read -p "Do you want to build the mIFF files (.mif)? (y/n) " response
response=${response,,}
if [[ $response =~ ^(yes|y| ) ]]; then
#prepare environment to run data file builders
oldPATH=$PATH
PATH=$basedir/build/bin:$PATH
$basedir/utils/build_miff.sh
buildTemplates=true
PATH=$oldPATH
fi
read -p "Do you want to build the datatables (.tab)? (y/n) " response
response=${response,,}
if [[ $response =~ ^(yes|y| ) ]]; then
#prepare environment to run data file builders
oldPATH=$PATH
PATH=$basedir/build/bin:$PATH
read -p "Do you want to use multi-core building (default) or the safer single-core option? You may need to rerun the single-core version if there are stragglers. (multi/safe) " response
response=${response,,}
if [[ $response =~ ^(multi|m| ) ]]; then
$basedir/utils/build_tab_multi.sh
else
$basedir/utils/build_tab.sh
fi
buildTemplates=true
PATH=$oldPATH
fi
read -p "Do you want to build the template files (.tpf)? (y/n) " response
response=${response,,}
if [[ $response =~ ^(yes|y| ) ]]; then
#prepare environment to run data file builders
oldPATH=$PATH
PATH=$basedir/build/bin:$PATH
read -p "Do you want to use multi-core building (default) or the safer single-core option? You may need to rerun the single-core version if there are stragglers. (multi/safe) " response
response=${response,,}
if [[ $response =~ ^(multi|m| ) ]]; then
$basedir/utils/build_tpf_multi.sh
else
$basedir/utils/build_tpf.sh
fi
buildTemplates=true
PATH=$oldPATH
fi
if [[ $buildTemplates = false ]]; then
read -p "Do you want to build the Object Template or Quest CRC files? (y/n) " response
response=${response,,}
if [[ $response =~ ^(yes|y| ) ]]; then
buildTemplates=true
fi
fi
templatesLoaded=false
if [[ $buildTemplates = true ]]; then
echo "Object Template and Quest CRC files will now be built and re-imported into the database."
if [[ -z "$DBSERVICE" ]]; then
echo "Enter the DSN for the database connection "
read DBSERVICE
fi
if [[ -z "$DBUSERNAME" ]]; then
echo "Enter the database username "
read DBUSERNAME
fi
if [[ -z "$DBPASSWORD" ]]; then
echo "Enter the database password "
read DBPASSWORD
fi
#prepare environment to run data file builders
oldPATH=$PATH
PATH=$basedir/build/bin:$PATH
$basedir/utils/build_object_template_crc_string_tables.py
$basedir/utils/build_quest_crc_string_tables.py
cd $basedir/src/game/server/database
echo "Loading template list"
perl ./templates/processTemplateList.pl < $basedir/dsrc/sku.0/sys.server/built/game/misc/object_template_crc_string_table.tab > $basedir/build/templates.sql
sqlplus ${DBUSERNAME}/${DBPASSWORD}@${DBSERVICE} @$basedir/build/templates.sql > $basedir/build/templates.out
templatesLoaded=true
cd $basedir
PATH=$oldPATH
fi
read -p "Import database? (y/n) " response
response=${response,,}
if [[ $response =~ ^(yes|y| ) ]]; then
cd $basedir/src/game/server/database/build/linux
if [[ -z "$DBSERVICE" ]]; then
echo "Enter the DSN for the database connection "
read DBSERVICE
fi
if [[ -z "$DBUSERNAME" ]]; then
echo "Enter the database username "
read DBUSERNAME
fi
if [[ -z "$DBPASSWORD" ]]; then
echo "Enter the database password "
read DBPASSWORD
fi
./database_update.pl --username=$DBUSERNAME --password=$DBPASSWORD --service=$DBSERVICE --goldusername=$DBUSERNAME --loginusername=$DBUSERNAME --createnewcluster --packages
if [[ $templatesLoaded = false ]]; then
echo "Loading template list"
perl ../../templates/processTemplateList.pl < ../../../../../../dsrc/sku.0/sys.server/built/game/misc/object_template_crc_string_table.tab > $basedir/build/templates.sql
sqlplus ${DBUSERNAME}/${DBPASSWORD}@${DBSERVICE} @$basedir/build/templates.sql > $basedir/build/templates.out
fi
fi
echo "Build complete!"
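The configuration step in the script above works by replacing literal placeholder tokens (DBSERVICE, DBUSERNAME, CLUSTERNAME, HOSTIP) in every copied .cfg file with the values you entered. A minimal standalone sketch of that sed pass, with a hypothetical config file and illustrative values:

```shell
# Create a sample config template containing the placeholder tokens
# (file name and keys are hypothetical; the real files come from configs/<env>).
mkdir -p exe_demo
cat > exe_demo/login.cfg <<'EOF'
databaseDSN=DBSERVICE
databaseUser=DBUSERNAME
clusterName=CLUSTERNAME
hostAddress=HOSTIP
EOF

# Illustrative values standing in for the interactive prompts
DBSERVICE=swg
DBUSERNAME=swgadmin
CLUSTERNAME=TestCluster
HOSTIP=192.168.1.10

# The same in-place multi-token substitution the build script applies to each .cfg;
# '@' is used as the sed delimiter so values may contain '/'.
for filename in $(find exe_demo -name '*.cfg'); do
    sed -i -e "s@DBSERVICE@$DBSERVICE@g" \
           -e "s@DBUSERNAME@$DBUSERNAME@g" \
           -e "s@CLUSTERNAME@$CLUSTERNAME@g" \
           -e "s@HOSTIP@$HOSTIP@g" "$filename"
done
```

Because the tokens are plain text, they must not collide with real config values; the script relies on the templates containing them only where substitution is intended.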

git_targets.xml (new executable file, 183 lines):

<?xml version="1.0" encoding="UTF-8"?>
<project name="git_targets" basedir="." >
<!-- ===============================
Git tasks
==================================== -->
<macrodef name = "git">
<attribute name = "command" />
<attribute name = "dir" default = "" />
<element name = "args" optional = "true" />
<sequential>
<echo message = "git @{command}" />
<exec executable = "git" dir = "@{dir}" failonerror="true">
<arg value = "@{command}" />
<args/>
</exec>
</sequential>
</macrodef>
<macrodef name = "git-clone-pull">
<attribute name = "repository" />
<attribute name = "dest" />
<attribute name = "branch" />
<sequential>
<git command = "clone">
<args>
<arg value = "@{repository}" />
<arg line = "-b @{branch}" />
<arg value = "@{dest}" />
</args>
</git>
<git-pull branch="@{branch}" dir="@{dest}" />
</sequential>
</macrodef>
<macrodef name = "git-pull">
<attribute name = "dir" />
<attribute name = "branch" />
<sequential>
<echo>Pulling @{dir}, branch: @{branch}</echo>
<git command = "pull" dir="@{dir}" >
<args>
<arg value = "origin" />
<arg value = "@{branch}" />
</args>
</git>
</sequential>
</macrodef>
<macrodef name = "git-push">
<attribute name = "dir" />
<attribute name = "branch" />
<sequential>
<echo>Pushing @{dir} to @{branch}</echo>
<git command = "push" dir="@{dir}" >
<args>
<arg value = "origin" />
<arg value = "@{branch}" />
</args>
</git>
</sequential>
</macrodef>
<macrodef name = "git-checkout">
<attribute name = "branch" />
<attribute name = "dir" />
<sequential>
<echo>Checking out @{branch} branch</echo>
<git command="checkout" dir="@{dir}">
<args>
<arg value="@{branch}" />
</args>
</git>
</sequential>
</macrodef>
<macrodef name = "git-commit">
<attribute name = "comment" />
<attribute name = "file" />
<attribute name = "dir" />
<attribute name = "branch" />
<sequential>
<echo>Committing @{file} to git on branch @{branch}</echo>
<git command="add" dir="@{dir}">
<args>
<arg value="@{file}" />
</args>
</git>
<git command="commit" dir="@{dir}">
<args>
<arg value="@{file}" />
<arg line="-m '@{comment}'" />
</args>
</git>
<git-push branch="@{branch}" dir="@{dir}" />
</sequential>
</macrodef>
<macrodef name = "git-tag">
<attribute name = "comment" />
<attribute name = "dir" />
<attribute name = "version" />
<sequential>
<if>
<isset property="dont_zip" />
<then/>
<else>
<echo>Tagging @{dir} @{version} </echo>
<git command="tag" dir="@{dir}">
<args>
<arg line="-a @{version}" />
<arg line="-m '@{comment}'" />
</args>
</git>
<git-push branch="@{version}" dir="@{dir}" />
</else>
</if>
</sequential>
</macrodef>
<!--
<target name="git_tag">
<git command="tag" dir="staging/${extn}">
<args>
<arg line="-a ${next_version}" />
<arg line="-m '${comment}'" />
</args>
</git>
<git-push-tag branch="${next_version}" dir="staging/${extn}" />
</target>-->
<!-- ===============================
Modman tasks
==================================== -->
<macrodef name="modman">
<attribute name="command" />
<attribute name="dir" default="" />
<element name="args" optional="true" />
<sequential>
<echo message="modman @{command} in @{dir}" />
<exec executable="modman" dir="@{dir}">
<arg value="@{command}" />
<args/>
</exec>
</sequential>
</macrodef>
<macrodef name="modman-clone">
<attribute name="repos" />
<attribute name="dest" />
<attribute name="branch" />
<sequential>
<modman command="clone" dir="@{dest}">
<args>
<arg value="@{repos}" />
<arg line="--branch @{branch}" />
</args>
</modman>
</sequential>
</macrodef>
<macrodef name="modman-clone-overlay">
<attribute name="extn" />
<attribute name="dest" />
<attribute name="branch" />
<attribute name="repos" />
<sequential>
<echo>Cloning @{extn} with branch @{branch} (named: @{extn}@{branch})</echo>
<modman command="clone" dir="@{dest}">
<args>
<arg value="@{extn}@{branch}" />
<arg value="@{repos}" />
<arg line="--branch @{branch}" />
</args>
</modman>
</sequential>
</macrodef>
<macrodef name="modman-update">
<attribute name="dir" />
<attribute name="extn" />
<sequential>
<echo>Updating @{dir}, extn: @{extn}</echo>
<modman command="update" dir="@{dir}">
<args>
<arg value="@{extn}" />
</args>
</modman>
</sequential>
</macrodef>
</project>
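The git-clone-pull macro above clones a specific branch and then pulls it again inside the checkout, so an already-existing working copy is brought up to date. The equivalent shell sequence, demonstrated against a throwaway local repository (all repo names and the branch are illustrative):

```shell
# Build a throwaway local repository to clone from
git init -q --bare origin_demo.git
git clone -q origin_demo.git seed
cd seed
git -c user.email=demo@example.org -c user.name=demo \
    commit -q --allow-empty -m "init"
git push -q origin HEAD:master   # publish the branch to the bare repo
cd ..

# Equivalent of <git-clone-pull repository="origin_demo.git"
#                               dest="work" branch="master" />:
git clone -q origin_demo.git -b master work
(cd work && git pull -q origin master)
```

The trailing pull is a no-op on a fresh clone but makes the macro idempotent when the destination directory already exists.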