automation

Terraform and GitHub Actions for V Rising hosting on AWS

It's been a while since I last played V Rising, but I thought this would be a good project to get my hands on setting up a CI/CD pipeline with Terraform and GitHub Actions (an upgraded version of my AWS V Rising hosting solution).

There are a few changes to the original solution. The first is using the V Rising Docker image (thanks to TrueOsiris) instead of manually installing the V Rising server on the EC2 instance. The Docker container is started as part of the EC2 user data. Here's the user data script.

The second change is the Terraform configuration, which turns all the manual setup steps into IaC. Note that the EC2 instance resource has a 'home_cdir_block' variable, referencing an input from a GitHub Actions secret, so only the IPs in 'home_cdir_block' can connect to our server. Another layer of protection is the server password in the user data script, which also takes its input from a GitHub secret.

The Terraform resources then get deployed by GitHub Actions, with OIDC configured to assume a role in AWS. The configuration process can be found here. The IAM role I set up for this project has 'AmazonEC2FullAccess' attached along with the inline policy below:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:ListBucket",
                "s3:GetObjectVersion"
            ],
            "Resource": [
                "arn:aws:s3:::<your-s3-bucket-name>",
                "arn:aws:s3:::<your-s3-bucket-name>/*"
            ]
        }
    ]
}

Oh, I forgot to mention: we also need an S3 bucket created to store the tfstate file, as stated in _provider.tf.

Below is an overview of the upgraded solution.

GitHub repo: https://github.com/tduong10101/Vrising-aws

Forza log streaming to OpenSearch

In this project I attempted to get Forza logs displaying in 'real time' on AWS OpenSearch (poor man's Splunk). Below is a quick overview of the log flow and access configurations.

Forza -> Pi -> Firehose data stream:

Setting up log streaming from Forza to the Raspberry Pi is quite straightforward. I forked jasperan's forza-horizon-5-telemetry-listener repo and updated it with a delay function and a function to send the logs to an AWS Firehose data stream (forked repo). Then I just leave the Python script running while I'm playing Forza.
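For reference, sending each telemetry record from the Pi boils down to a boto3 put_record call. This is only a minimal sketch, not the code from the forked repo; the stream name, region, payload fields and delay value are assumptions.

import json
import time
import boto3

# Assumed stream name and region for illustration; the real values live in the forked repo.
firehose = boto3.client("firehose", region_name="ap-southeast-2")
STREAM_NAME = "forza-telemetry-stream"

def send_telemetry(record: dict, delay_seconds: float = 1.0) -> None:
    """Push one telemetry record to the Firehose stream, then wait (the 'delay' behaviour)."""
    firehose.put_record(
        DeliveryStreamName=STREAM_NAME,
        Record={"Data": (json.dumps(record) + "\n").encode("utf-8")},
    )
    time.sleep(delay_seconds)

# Hypothetical payload shape, just to show the call:
send_telemetry({"speed": 212.4, "gear": 5, "timestamp": time.time()})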

OpenSearch/Cognito configuration:

OK, this is the hard part; I spent most of my time on this. Firstly, to have Firehose stream data into OpenSearch, I needed to somehow map the AWS access roles to OpenSearch roles. The 'fine-grained access' option on its own will not work; we can either use SAML to hook up to an IdP or use the in-house AWS Cognito. Since I don't have an existing IdP, I had to set up a Cognito identity pool and user pool. From there I could give the OpenSearch admin role to the Cognito authenticated role, which is assigned to a user pool group.
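I did the role mapping through the OpenSearch Dashboards security UI (screenshots below), but for illustration, the equivalent mapping can be expressed as a call to the security plugin's REST API. This is a hedged sketch only: the domain endpoint, role ARN and region are placeholders, requests-aws4auth is an assumed helper package, and it assumes the signing IAM identity is already mapped as the domain's master/security admin user.

import boto3
import requests
from requests_aws4auth import AWS4Auth  # assumed helper for SigV4-signed requests

DOMAIN = "https://<your-opensearch-domain-endpoint>"  # placeholder
COGNITO_AUTH_ROLE_ARN = "arn:aws:iam::<account-id>:role/<cognito-authenticated-role>"  # placeholder

# Sign requests with credentials of an IAM identity that already has security admin rights.
creds = boto3.Session().get_credentials()
awsauth = AWS4Auth(creds.access_key, creds.secret_key, "ap-southeast-2", "es",
                   session_token=creds.token)

# Map the Cognito authenticated role onto the built-in all_access (admin) OpenSearch role.
# Note: PUT replaces the whole mapping, so include every backend role that should keep access.
resp = requests.put(
    f"{DOMAIN}/_plugins/_security/api/rolesmapping/all_access",
    json={"backend_roles": [COGNITO_AUTH_ROLE_ARN]},
    auth=awsauth,
)
resp.raise_for_status()
print(resp.json())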

Below are some screenshots of the OpenSearch cluster and Cognito. Also, thanks to Soumil Shah; his video on setting up Cognito with OpenSearch helped a lot. Here's the link.

opensearch configure

opensearch security configure

opensearch role

Note: all of this can be bypassed if I choose to send the logs straight to OpenSearch via the HTTPS REST API. But then it would be too easy ;)

Note 2: I've gone with t3.small.search, which is why I put real-time in quotation marks.

Firehose data stream -> Firehose delivery stream -> OpenSearch:

Setting up the delivery stream is not too hard. There's a template for sending logs to OpenSearch. Just remember to give it the right access role.

Here is what it looks like in OpenSearch:

I'm too tired to build any dashboards for it. Also, the timestamp from the log didn't get transformed into the 'date' type, so I'll need to look into that another time.

Improvements:

  • Docker setup to run the Python listener/log streaming script
  • maybe stream the logs straight to OpenSearch for real-time logging? I feel insecure sending the username/password with the payload though.
  • do this but with Splunk? I'm sure the indexing performance would be much better. There's already an add-on for Forza on Splunk, but it's not available on Splunk Cloud. That add-on is where I got the idea for this project.

Spotify - Time machine

"Music is the closest thing we have to a time machine"

Want to go back in time and listen to the popular songs of that era? With a bit of web scraping, it is possible to search for and create a Spotify playlist of the top songs for any given date in the last 20 years.

Where does the idea come from?

This is one of the projects in the 100 days of Python challenge - here is the link.

How does it work?

This script scrapes the Billboard Hot 100 songs for the user's input date. Then it creates a Spotify playlist with the following title format: "hot-100-"
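The playlist-creation side is handled with spotipy. Below is a minimal sketch of that flow, not the repo's actual code; the OAuth scope, playlist naming and search query format are assumptions, and the client ID/secret/redirect URI come from your own Spotify developer app.

import spotipy
from spotipy.oauth2 import SpotifyOAuth

# Credentials are read from the SPOTIPY_CLIENT_ID, SPOTIPY_CLIENT_SECRET and
# SPOTIPY_REDIRECT_URI environment variables (standard spotipy behaviour).
sp = spotipy.Spotify(auth_manager=SpotifyOAuth(scope="playlist-modify-private"))

def create_time_machine_playlist(date: str, song_names: list[str]) -> str:
    """Create a private 'hot-100-<date>' playlist and fill it with the scraped songs."""
    user_id = sp.current_user()["id"]
    playlist = sp.user_playlist_create(user=user_id, name=f"hot-100-{date}", public=False)

    track_uris = []
    for name in song_names:
        # Hypothetical query format: search by track name plus the chart year.
        results = sp.search(q=f"track:{name} year:{date[:4]}", type="track", limit=1)
        items = results["tracks"]["items"]
        if items:
            track_uris.append(items[0]["uri"])

    sp.playlist_add_items(playlist_id=playlist["id"], items=track_uris)
    return playlist["id"]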

To set it up, please follow "getting started" in this git repo. Here is an overview flow:

Challenges

  • The Spotify API authentication integration is complex and hard to get my head around. I ended up using the spotipy Python module instead of making requests directly to the Spotify API.

  • Took me a bit of time to figure out the Billboard site structure. Overall not too bad; I cheated a bit and checked the solution, but this is my own code version (see the scraping sketch after this list).

  • Duplicate playlists! Not sure if this is a bug, but playlists created by the script do not show up in user_playlists(), so if a date is re-entered, a duplicate playlist gets created.
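For reference, the scraping step looks roughly like the sketch below. It is a hedged approximation rather than the repo's code: the CSS selector is an assumption about the Billboard Hot 100 page markup and may break if the site changes.

import requests
from bs4 import BeautifulSoup

def get_hot_100(date: str) -> list[str]:
    """Return the Hot 100 song titles for a date formatted as YYYY-MM-DD."""
    url = f"https://www.billboard.com/charts/hot-100/{date}/"
    # A browser-like User-Agent helps avoid the request being blocked.
    response = requests.get(url, headers={"User-Agent": "Mozilla/5.0"})
    response.raise_for_status()

    soup = BeautifulSoup(response.text, "html.parser")
    # Assumed selector: song titles rendered as <h3> elements inside the chart list rows.
    titles = soup.select("li ul li h3")
    return [title.get_text(strip=True) for title in titles]

print(get_hot_100("2005-07-09")[:10])  # first 10 songs of that week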

Get AWS IAM credentials report script

A quick PowerShell script to generate an AWS IAM credentials report and save it in CSV format to a local directory.

Import-Module AWSPowerShell

$reportLocation = "C:\report"
if (!(Test-Path($reportLocation))) {
    New-Item -ItemType Directory -Path $reportLocation
}

$date = Get-Date -Format dd-MM-yy-hh-mm-ss
$reportName = "aws-credentials-report-$date.csv"
$reportPath = Join-Path -Path $reportLocation -ChildPath $reportName

# request the IAM credential report to be generated, polling until it's complete
do {
    $result = Request-IAMCredentialReport
    Start-Sleep -Seconds 10
} while ($result.State.Value -notmatch "COMPLETE")

# get the IAM report as text
$report = Get-IAMCredentialReport -AsTextArray
# convert to a PowerShell object
$report = $report | ConvertFrom-Csv
# export to the set location
$report | Export-Csv -Path $reportPath -NoTypeInformation