Merged
35 commits
17c372d
Added XZ archive support
top-sekret Dec 16, 2018
7f5a93b
When checking dependencies, check for (g)libtoolize instead of (g)lib…
top-sekret Dec 16, 2018
c33cc3c
Update GCC to version 8.2.0
top-sekret Dec 16, 2018
9164eee
Put common parts of both GCC stages in one file.
top-sekret Dec 16, 2018
17c0a04
Dispose of duplicate MIPS_ATYPE_HI
top-sekret Dec 16, 2018
4739a43
GMP, MPFR, MPC and ISL are no longer a build requirement
top-sekret Dec 16, 2018
4819798
Remove lto from languages (GCC doesn't seem to like this)
top-sekret Dec 16, 2018
57d78db
Fix a build error of stage 2 GCC
top-sekret Dec 16, 2018
cd233d4
Fix the actual issue with GCC build and LTO
top-sekret Dec 16, 2018
3045dc9
Fused the prepare scripts into one script.
top-sekret Dec 16, 2018
38281af
Generally reworked scripting system.
top-sekret Dec 18, 2018
3a6e1a2
Fixed a bug when if archives were partial they would not be extracted…
top-sekret Dec 18, 2018
ea86413
Added an initial GitLab CI configuration
top-sekret Apr 11, 2019
093f171
Run builds in parallel
top-sekret Apr 11, 2019
c48c191
Some improvements to README.md and prepare.sh
top-sekret Apr 11, 2019
d457fbb
Use pigz -5 -p4 instead of xz -6e to improve compression performance
top-sekret Apr 11, 2019
0d23d6c
Remove all compression from job artifacts
top-sekret Apr 11, 2019
6dc0ad7
Merge branch 'gitlab-ci'
top-sekret Apr 11, 2019
df2bfe8
Update GCC to 8.3.0
top-sekret Apr 11, 2019
7611fd3
Merge branch 'gcc-8.3.0'
top-sekret Apr 11, 2019
f081ec9
Disposed of Insight, which is deprecated and unsupported
top-sekret Apr 11, 2019
71b8fbc
Updated binutils to 2.23.1
top-sekret Apr 12, 2019
bc14dd3
Fixed readline.h dependency check on Manjaro.
GTgunner Oct 7, 2019
53a2646
Merge pull request #1 from GTgunner/master
top-sekret Feb 9, 2020
22cbdea
Fix broken if check
sharkwouter Apr 2, 2020
6c89a0f
Merge pull request #2 from sharkwouter/patch-1
top-sekret Apr 5, 2020
5439498
Remove USER variable also from checks for macOS
top-sekret Apr 5, 2020
728cb11
Enable plugin support
wally4000 Apr 11, 2020
ab2e342
Merge pull request #4 from wally4000/master
top-sekret Apr 23, 2020
78fb32d
Updated GCC to 9.3
dbeef Apr 16, 2020
bbf23b8
Added `--depth 1` to clone_git_repo function.
dbeef Apr 15, 2020
6436bda
Removed newlib patch; cloning pspdev fork with patch already applied.
dbeef Apr 15, 2020
9283491
Deleted GitLab CI
top-sekret Apr 23, 2020
53e4e16
Removed stale automake1.9 and added libtool-bin for "libtool" binary.
flipbit03 Oct 30, 2017
ee729b7
Upgraded to binutils 2.23.2.
top-sekret Apr 23, 2020
108 changes: 66 additions & 42 deletions README.md
@@ -3,67 +3,91 @@ What does this do?

This program will automatically build and install a compiler and other tools used in the creation of homebrew software for the Sony PlayStation Portable handheld video game system.

How do I use it?
==================

1. Set up your environment by installing the following software:
Dependencies
============

autoconf, automake, bison, flex, gcc, g++/gcc-c++, libusb-dev, make, ncurses, patch, readline, subversion, texinfo, wget, mpc, gmp, libelf, mpfr, git
PSP toolchain depends on the following packages:

2. Set the PSPDEV and PATH environmental variables:
```
autoconf, automake, bison, bzip2, flex, gcc, g++/gcc-c++, gzip, libusb-dev, make, ncurses, patch, readline, subversion, texinfo, xz-utils, wget, libelf, git
```

export PSPDEV=/usr/local/pspdev
export PATH=$PATH:$PSPDEV/bin
Automatic dependency installation
---------------------------------

The PSPDEV variable is the directory the toolchain will be installed to; change this if you wish. If possible, the toolchain script will automatically add these variables to your system's login scripts; otherwise you will need to add them manually yourself.
On both Linux and macOS, you can install the dependencies using a script:

3. Run the toolchain script:
```shell
sudo ./prepare.sh
```

./toolchain.sh
This is much easier than installing the packages manually, and is therefore
the preferred method, especially for newcomers.

> NOTE: If you have issues with compiling try increasing the amount of memory
> available to your system by creating a swapfile.
>
> dd if=/dev/zero of=/swapfile bs=1M count=2048
> chmod 600 /swapfile
> mkswap /swapfile
> swapon /swapfile
> NOTE: This script might fail. In that case you will need to fall back to
> manual installation as described above. Currently, the preparation script
> supports Linux with the APT package manager (Debian, Ubuntu and derivatives)
> and macOS with [MacPorts](https://www.macports.org/) or
> [Homebrew](https://brew.sh).
>
> After you are done, use `swapoff -a` to disable the swapfile. Finally you can
> remove it with `rm`.
> If you are using Homebrew, do not use ``sudo``.

Ubuntu
------
Docker dependency installation
------------------------------

1. Install the required packages by running.
If you are using a Docker container or don't mind using one, you can use the
image ``topsekretpl/psptoolchain-ci`` from the Docker registry. It contains all
dependencies bundled into an upstream Debian installation. This is not
recommended for newcomers or people not familiar with Docker.

sudo apt-get install g++ build-essential autoconf automake cmake doxygen bison flex libncurses5-dev libsdl1.2-dev libreadline-dev libusb-dev texinfo libgmp3-dev libmpfr-dev libelf-dev libmpc-dev libfreetype6-dev zlib1g-dev libtool libtool-bin subversion git tcl unzip wget
To build the PSP toolchain inside a Docker container, run:

2. Build and install the toolchain and SDK.
```shell
sudo docker pull topsekretpl/psptoolchain-ci:latest
sudo docker run --interactive --rm --tty <other options...> topsekretpl/psptoolchain-ci:latest /bin/bash -i -l
```

sudo ./toolchain-sudo.sh

> NOTE: If you do not wish for the toolchain to be installed in /usr/local/pspdev then edit toolchain-sudo.sh and change the INSTALLDIR variable.
Note that this image does **not** contain the PSP toolchain itself, only its
dependencies.

Building
========

To build the toolchain and install it in ``/usr/local/pspdev``, run:

OSX
---
```shell
sudo ./toolchain-sudo.sh
```

1. Install [`port`][MacPorts] or [`brew`][HomeBrew].
2. Run `prepare-mac-os.sh`. This will auto-install all the libraries you will need before building.

sudo ./prepare-mac-os.sh
Alternatively, there is another script that allows you to install the toolchain
in a subdirectory of this repository:

3. Build and install the toolchain and SDK.
sudo ./toolchain-sudo.sh
```shell
./toolchain-local.sh
```

Where do I go from here?
========================
Both are wrappers around ``toolchain.sh``, which is slightly more involved to
use directly but much more powerful.

> NOTE: If you do not wish for the toolchain to be installed in
> ``/usr/local/pspdev``, run ``toolchain.sh`` manually. Its ``-h`` option
> provides all necessary help.

> NOTE: If you have issues with compiling try increasing the amount of memory
> available to your system by creating a swapfile.
>
> dd if=/dev/zero of=/swapfile bs=1M count=2048
> chmod 600 /swapfile
> mkswap /swapfile
> swapon /swapfile
>
> After you are done, use ``swapoff /swapfile`` to disable the swapfile. Then
> remove it with ``rm``.

Next steps
==========

Visit the following sites to learn more:

http://www.ps2dev.org
http://forums.ps2dev.org

[MacPorts]: http://www.macports.org/
[HomeBrew]: http://brew.sh/
179 changes: 116 additions & 63 deletions common.sh
100755 → 100644
@@ -1,77 +1,130 @@
# Returns the number of processor cores available
# Usage: num_cpus
function num_cpus
#!/bin/sh

# Run the tar command.
# Usage: run_tar <directory> <tar options...>
run_tar ()
{
# This *should* be available on literally everything, including OSX
getconf _NPROCESSORS_ONLN
DIRECTORY="$1"
shift

tar --directory="$DIRECTORY" --no-same-owner --strip-components=1 "$@"
}
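The ``--strip-components=1`` option is what makes ``run_tar`` independent of the archive's top-level directory name. A minimal sketch of that behaviour (the ``pkg-1.0`` name is hypothetical):

```shell
# The archive's single top-level directory is dropped, so files land
# directly in the target directory regardless of the upstream tarball's
# internal naming.
WORK="$(mktemp -d)"
mkdir -p "$WORK/pkg-1.0"
echo 'hello' > "$WORK/pkg-1.0/file.txt"
tar -C "$WORK" -cf "$WORK/pkg.tar" pkg-1.0

mkdir -p "$WORK/out"
tar --directory="$WORK/out" --no-same-owner --strip-components=1 -xf "$WORK/pkg.tar"

ls "$WORK/out"    # file.txt, not pkg-1.0/file.txt
rm -rf "$WORK"
```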

# Extracts a file based on its extension
# Usage: extract <archive>
function auto_extract
# Usage: extract <archive> <directory>
auto_extract ()
{
path=$1
name=`echo $path|sed -e "s/.*\///"`
ext=`echo $name|sed -e "s/.*\.//"`

echo "Extracting $name..."

case $ext in
"tar") tar --no-same-owner -xf $path ;;
"gz"|"tgz") tar --no-same-owner -xzf $path ;;
"bz2"|"tbz2") tar --no-same-owner -xjf $path ;;
"zip") unzip $path ;;
*) echo "I don't know how to extract $ext archives!"; return 1 ;;
esac

return $?
FILE="$1"
NAME="$(echo "$FILE" | sed -e 's|^.*/||')"
EXTENSION="$(echo "$NAME" | sed -e 's|.*\.||')"
DIRECTORY="$2"

if [ -d "$DIRECTORY" ]
then
echo "Deleting existing $DIRECTORY"
rm -rf "$DIRECTORY"
fi

mkdir -p "$DIRECTORY"

echo "Extracting $NAME"

case "$EXTENSION" in
tar)
run_tar "$DIRECTORY" -xf "$FILE"
;;
gz | tgz)
run_tar "$DIRECTORY" -xzf "$FILE"
;;
bz2 | tbz2)
run_tar "$DIRECTORY" -xjf "$FILE"
;;
xz | txz)
run_tar "$DIRECTORY" -xJf "$FILE"
;;
*)
cat >&2 <<_EOF_
Unsupported archive type: $EXTENSION
_EOF_
return 1
;;
esac

return $?
}
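The name and extension are derived with two greedy ``sed`` substitutions; a small sketch of what they produce (the path below is hypothetical):

```shell
# Strip everything up to the last slash to get the file name, then
# everything up to the last dot to get the extension.
FILE="/tmp/downloads/gcc-8.3.0.tar.xz"
NAME="$(echo "$FILE" | sed -e 's|^.*/||')"
EXTENSION="$(echo "$NAME" | sed -e 's|.*\.||')"
echo "$NAME"         # gcc-8.3.0.tar.xz
echo "$EXTENSION"    # xz
```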

# Downloads and extracts a file, with some extra checks.
# Usage: download_and_extract <url> <output?>
function download_and_extract
# Usage: download_and_extract <url> <output directory>
download_and_extract ()
{
url=$1
name=`echo $url|sed -e "s/.*\///"`
outdir=$2

# If there is already an extracted directory, delete it, otherwise
# reapplying patches gets messy. I tried.
[ -d $outdir ] && echo "Deleting old version of $outdir" && rm -rf $outdir

# First, if the archive already exists, attempt to extract it. Failing
# that, attempt to continue an interrupted download. If that also fails,
# remove the presumably corrupted file.
[ -f $name ] && auto_extract $name || { wget --continue --no-check-certificate $url -O $name || rm -f $name; }

# If the file does not exist at this point, it means it was either never
# downloaded, or it was deleted for being corrupted. Just go ahead and
# download it.
# Using wget --continue here would make buggy servers flip out for nothing.
[ -f $name ] || wget --no-check-certificate $url -O $name && auto_extract $name
URL="$1"
NAME="$(echo "$URL" | sed -e 's|^.*/||;s|\?.*$||')"
OUT_DIR="$2"

# First, if the archive already exists, attempt to extract it. Failing
# that, attempt to continue an interrupted download. If that also fails,
# remove the presumably corrupted file.
if [ -f "$NAME" ]
then
if auto_extract "$NAME" "$OUT_DIR"
then
return 0
else
wget --continue --no-check-certificate "$URL" -O "$NAME" || rm -f "$NAME"
fi
fi

# If the file does not exist at this point, it means it was either never
# downloaded, or it was deleted for being corrupted. Just go ahead and
# download it.
# Using wget --continue here would make buggy servers flip out for nothing.
if ! [ -f "$NAME" ]
then
wget --no-check-certificate "$URL" -O "$NAME" || return 1
fi

auto_extract "$NAME" "$OUT_DIR"
}
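The retry flow (extract the cached archive, otherwise download and then extract) can be sketched with a local stand-in for ``wget``; the ``fetch_stub`` function below is hypothetical and only illustrates the control flow:

```shell
WORK="$(mktemp -d)"
mkdir -p "$WORK/src/pkg-1.0"
echo data > "$WORK/src/pkg-1.0/data.txt"
tar -C "$WORK/src" -cf "$WORK/remote.tar" pkg-1.0

# Hypothetical stand-in for wget: "downloads" the archive from a local copy.
fetch_stub () { cp "$WORK/remote.tar" "$WORK/pkg.tar"; }

cd "$WORK" || exit 1
if ! [ -f pkg.tar ]
then
    # Archive not cached from an earlier run: fetch it.
    fetch_stub
fi
mkdir -p out
tar --directory=out --no-same-owner --strip-components=1 -xf pkg.tar
ls out    # data.txt
cd / && rm -rf "$WORK"
```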

# Runs Git in a way that won't lock waiting for the user or anything.
# Usage: git_noninteractive <normal git args...>
git_noninteractive ()
{
SSH_ASKPASS=false git "$@" </dev/null
}
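Redirecting stdin from ``/dev/null`` means any attempted read sees an immediate EOF instead of blocking, while ``SSH_ASKPASS=false`` makes any graphical password prompt fail instead of appearing. A tiny sketch of the stdin half:

```shell
# read returns nonzero immediately on EOF instead of waiting for input;
# the same happens to any credential prompt a git subprocess would
# otherwise raise when its stdin is /dev/null.
if read LINE </dev/null
then
    echo "got input"
else
    echo "no input (EOF)"
fi
```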

# Clones or updates a Git repository.
# Usage: clone_git_repo <hostname> <user> <repo> <branch>
function clone_git_repo
# Usage: clone_git_repo <url> <dir> [branch]
clone_git_repo ()
{
URL="$1"
LOCAL_DIR="$2"
BRANCH="${3:-master}"

# It is possible that this is an actual copy of the repository we are
# interested in.
if [ -d "$LOCAL_DIR/.git" ] \
&& git_noninteractive -C "$LOCAL_DIR" status >/dev/null 2>&1 \
&& [ "x$(git_noninteractive -C "$LOCAL_DIR" remote get-url origin 2>/dev/null)" = "x$URL" ]
then
echo "Updating existing repository in $LOCAL_DIR"
cd "$LOCAL_DIR" || return 1
git_noninteractive pull --rebase origin "$BRANCH" || return 1
git_noninteractive reset --hard || return 1
git_noninteractive clean -dfx || return 1
cd - >/dev/null 2>&1 || return 1
return 0
else
echo "Deleting existing $LOCAL_DIR"
rm -rf "$LOCAL_DIR"
fi

git_noninteractive clone --recursive --depth 1 -b "$BRANCH" "$URL" "$LOCAL_DIR" || return 1
}
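The ``${3:-master}`` expansion supplies a default when the branch argument is omitted. A sketch of the idiom (the ``describe`` function is hypothetical):

```shell
# ${N:-default} expands to the positional parameter if set and non-empty,
# otherwise to the default -- the same way clone_git_repo defaults its
# branch argument to master.
describe ()
{
    BRANCH="${1:-master}"
    echo "building branch $BRANCH"
}

describe           # building branch master
describe gcc-9     # building branch gcc-9
```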

# Runs make with our options.
# Usage: run_make <normal make args...>
run_make ()
{
host=$1
user=$2
repo=$3
branch=${4:-master}

OLDPWD=$PWD

# Try to update an existing repository at the target path.
# Nuke it if it's corrupted and the pull fails.
[ -d $repo/.git ] && { cd $repo && git pull; } || rm -rf $OLDPWD/$repo

# The above command may leave us standing in the existing repo.
cd $OLDPWD

# If it does not exist at this point, it was never there in the first place
# or it was nuked due to being corrupted. Clone and track master, please.
# Attempt to clone over SSH if possible, use anonymous HTTP as fallback.
# Set SSH_ASKPASS and stdin(<) to prevent it from freezing to ask for auth.
[ -d $repo ] || SSH_ASKPASS=false git clone --recursive -b $branch git@$host:$user/$repo.git $repo < /dev/null || SSH_ASKPASS=false git clone --recursive -b $branch https://$host/$user/$repo.git $repo < /dev/null || return 1
make -j"$JOBS" "$@"
}
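``run_make`` expects ``JOBS`` to be set by the caller (presumably ``toolchain.sh``; that is an assumption). The removed ``num_cpus`` helper shows where such a value can come from portably:

```shell
# getconf _NPROCESSORS_ONLN is available on both Linux and macOS, which
# is why the old num_cpus helper used it; JOBS can default to it when
# the caller has not set it.
JOBS="${JOBS:-$(getconf _NPROCESSORS_ONLN)}"
echo "would run: make -j$JOBS"
```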