Mike Street's Blog & Notes https://www.mikestreety.co.uk Blog posts, notes and links from Mike Street (mikestreety.co.uk) en-gb Tue, 18 Mar 2025 08:42:02 GMT Tue, 18 Mar 2025 08:42:02 GMT https://www.mikestreety.co.uk/assets/img/favicon-512.png Mike Street's Blog & Notes https://www.mikestreety.co.uk 144 144 Lead Developer and CTO Delete all git tags from a project https://www.mikestreety.co.uk/blog/delete-all-git-tags-from-a-project/ Tue, 18 Mar 2025 00:00:00 GMT https://www.mikestreety.co.uk/blog/delete-all-git-tags-from-a-project/ <![CDATA[

To delete the git tags on the remote (e.g. GitHub or GitLab) you need to have the tags locally - as it uses the local list.

Ensure you have the tags locally by running git fetch origin; you can then run git tag to confirm the tags are there.
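
For example (adding --tags ensures every remote tag is fetched, not just those reachable from the fetched branches):

git fetch origin --tags
git tag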

Removing the tags from remote can then be done with:

git push origin --delete $(git tag -l)

This passes the result of git tag -l into the --delete parameter.

To delete locally, you can run:

git tag -d $(git tag)

Read time: 1 mins

Tags:

]]>
Test isolated HTML with Playwright https://www.mikestreety.co.uk/blog/test-isolated-html-with-playwright/ Tue, 11 Mar 2025 00:00:00 GMT https://www.mikestreety.co.uk/blog/test-isolated-html-with-playwright/ <![CDATA[

When you read about Playwright, a lot of the examples show testing static sites or JavaScript-powered applications and isolating components within them.

However, Playwright has many more applications beyond React-based websites and can help test monolithic or traditional LAMP-based websites (think WordPress or TYPO3).

I've covered testing the front-end of a TYPO3 project before; however, those tests required a running application with PHP and a database available.

What if you wanted to test parts of your application as part of a CI without spinning up a whole server?

Reasoning

The general principle behind our CI tests is isolating the HTML while using the application's bundled JavaScript. We decided to include the full JS file for two reasons:

  1. Our bundler (webpack, in this instance) converts ESM-based JavaScript to be cross-browser
  2. This more closely simulates the "real world" and allows us to check that a JavaScript change elsewhere hasn't broken some functionality

Isolating HTML

The first step is to isolate the HTML from your application which is specific to the bit of JavaScript you wish to test. Although we include the full JS bundle, we only want to test specific functionality.

We then hard-code the expected HTML in our test. A second benefit to doing it this way is that we have a record of what the expected HTML is. This means if the code changes via the CMS or a developer and the test fails in the real world, we have a record as to why.

Create a new test and use the setContent function (setContent in the Playwright docs) on the page object to set the page's HTML

Tip: Browsers will add a <head> and <body> element if they don't exist so, unless your JavaScript explicitly requires these, you can omit them from your HTML

import { test, expect } from '@playwright/test';

test('Checks the site selector correctly updates & navigates when isolated',  async({ page }) => {
	// Set the HTML
	await page.setContent(`<div class="alert"></div>`);
});

Adding JavaScript

The next thing we do is load the JavaScript. We do this using the addScriptTag function (addScriptTag in the docs), specifically using the path attribute.

This takes a JavaScript file and loads it into the page itself - this means the JS file doesn't need to be "accessible" on a URL and helps keep the test contained.

import { test, expect } from '@playwright/test';

test('Checks the site selector correctly updates & navigates when isolated',  async({ page }) => {
	// Set the HTML
	await page.setContent(`<div class="alert"></div>`);

	// Load the JS
	await page.addScriptTag({
		path: 'app/sites/site_package/Resources/Public/JavaScript/core.js',
	});

	await expect(page.locator('.alert')).toHaveClass('alert-dismissed');

});

The path is relative to your playwright.config.ts (generally the root of your project).

From there, you can run the normal expect() function to test your JS.

Grouping

Our convention is to group similar tests with a test.describe - one test using isolated HTML like the above and a second testing on the website itself.

The isolated test has an additional tag of @ci - this allows us to run only the tagged tests in our pipeline with the following:

npx playwright test --grep @ci

Our two tests would look something like this:

import { test, expect } from '@playwright/test';

test.describe('Alert test', () => {

	test('Test in isolation', { tag: ['@ci'] }, async({ page }) => {
		// Set the HTML
		await page.setContent(`<div class="alert"></div>`);

		// Load the JS
		await page.addScriptTag({
			path: 'app/sites/site_package/Resources/Public/JavaScript/core.js',
		});

		await expect(page.locator('.alert')).toHaveClass('alert-dismissed');
	});

	test('Test on the site', async({ page }) => {
		await page.goto('https://www.mikestreety.co.uk/');

		await expect(page.locator('.alert')).toHaveClass('alert-dismissed');
	});
});

If the tests on the site and in isolation were exactly the same, we could extract the assertions into a function and run it in both instances.
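
As a rough sketch of that approach (the shared function name is illustrative):

import { test, expect } from '@playwright/test';

// Shared assertion used by both the isolated and on-site tests
const expectAlertDismissed = async (page) => {
	await expect(page.locator('.alert')).toHaveClass('alert-dismissed');
};

test.describe('Alert test', () => {

	test('Test in isolation', { tag: ['@ci'] }, async({ page }) => {
		// Set the HTML and load the bundled JS
		await page.setContent(`<div class="alert"></div>`);
		await page.addScriptTag({
			path: 'app/sites/site_package/Resources/Public/JavaScript/core.js',
		});

		await expectAlertDismissed(page);
	});

	test('Test on the site', async({ page }) => {
		await page.goto('https://www.mikestreety.co.uk/');

		await expectAlertDismissed(page);
	});
});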

Read time: 4 mins

Tags:

]]>
Delete Docker images from a certain vendor https://www.mikestreety.co.uk/blog/delete-docker-images-from-a-certain-vendor/ Fri, 21 Feb 2025 00:00:00 GMT https://www.mikestreety.co.uk/blog/delete-docker-images-from-a-certain-vendor/ <![CDATA[

Using Renovate to update your dependencies is a great way of automating upgrades. However, using the Docker image can quickly fill up your CI server or machine.

With their rapid release schedule, a day can see several new versions appearing. We have Renovate running every 2 hours which, as Renovate updates itself, could see 6 new Docker images downloaded a day (Renovate makes the version upgrade in one run and then merges it on the next).

As we have NPM, Composer, Docker and Gitlab CI dependencies updated by Renovate, we find ourselves using the -full image which, uncompressed, is over 6GB.

Because of that, we now have the following command running weekly:

docker rmi `docker image ls | egrep "^renovate/" | awk '{print$3}'`

This finds all the images that start with renovate and deletes them. When Renovate next runs, it will pull down the image it needs.
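
A minimal sketch of scheduling this weekly with cron (assuming the user running it can talk to the Docker daemon):

# Remove Renovate images every Sunday at 03:00
0 3 * * 0  docker rmi $(docker image ls | grep "^renovate/" | awk '{print $3}')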

Read time: 1 mins

Tags:

]]>
Set up Xdebug with DDEV and VS Code https://www.mikestreety.co.uk/blog/set-up-xdebug-with-ddev-and-vs-code/ Fri, 10 Jan 2025 00:00:00 GMT https://www.mikestreety.co.uk/blog/set-up-xdebug-with-ddev-and-vs-code/ <![CDATA[

I don't set up Xdebug regularly enough to remember all the steps and processes involved. These steps are documented really well in the DDEV documentation; however, with those docs needing to cater to the many, I sometimes get waylaid or confused finding the steps relevant to me.

Before starting, make sure you have installed the PHP Debug VS Code extension.

  1. In the terminal run code .vscode/launch.json .vscode/tasks.json
  2. In launch.json file, paste in the contents of the launch.json file (see below)
  3. In tasks.json file, paste in the contents of the tasks.json file (see below)
  4. Press F5 (or navigate to the Debug panel and click the ▶️ button)

Using the tasks.json, this should start Xdebug in the DDEV container (ddev xdebug on) and you should be able to start debugging.
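
If debugging doesn't connect, you can check whether Xdebug is currently enabled in the container with:

ddev xdebug status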

For any further configuration or documentation, check out the DDEV docs.

Files

File contents copied here for ease/speed

launch.json

{
    "version": "0.2.0",
    "configurations": [
        {
            "name": "Listen for Xdebug",
            "type": "php",
            "request": "launch",
            "hostname": "0.0.0.0",
            "port": 9003,
            "pathMappings": {
                "/var/www/html": "${workspaceFolder}"
            },
            "preLaunchTask": "DDEV: Enable Xdebug",
            "postDebugTask": "DDEV: Disable Xdebug"
        }
    ]
}

tasks.json

{
    "version": "2.0.0",
    "tasks": [
        {
            "label": "DDEV: Enable Xdebug",
            "type": "shell",
            "command": "ddev xdebug on"
        },
        {
            "label": "DDEV: Disable Xdebug",
            "type": "shell",
            "command": "ddev xdebug off"
        }
    ]
}

Read time: 2 mins

Tags:

]]>
Use local filesystem for Gitlab CI cache https://www.mikestreety.co.uk/blog/use-local-filesystem-for-gitlab-ci-cache/ Tue, 07 Jan 2025 00:00:00 GMT https://www.mikestreety.co.uk/blog/use-local-filesystem-for-gitlab-ci-cache/ <![CDATA[

Despite happily using MinIO to store my runner caches for a few years now, I've been looking for a way to store my global Gitlab CI runner caches on the local filesystem.

My reasons for this are twofold: one is infrastructure cost, meaning we only need to pay for and maintain one VPS (as opposed to one for MinIO and one for Gitlab CI), and the other is speed - this is just a hunch, but storing caches locally is probably quicker than uploading to and downloading from a different server.

I did consider using AWS for my Gitlab runners & runner cache, however the big unknown is the cost. I have no clue how much my runners and storage would cost, and you hear so many horror stories that I have steered clear.

Instead, I have a VPS on Hetzner which costs:

  • €7.05 a month for a 3 CPU / 4 GB RAM / 80 GB HDD VPS
  • ~€0.58 a month for an IPv4 address
  • €2.64 a month for the 60GB volume I have mounted for the cache

I decided to include an extra mounted volume to store the cache to allow a bit more flexibility and isolate the caching.

Local Caching

Local caching isn't really mentioned in the Gitlab docs - it is referenced but never explicitly laid out - so there was a lot of guessing as to how to do it and what goes where.

The below talks about the Docker executors and runners, but I assume it could work for the other ones too.

There are 2 caches when using a runner: the runner cache and the runner.docker cache. From what I can gather, the runner cache is for the actual filesystem, while the path used in the runner.docker section is where assets from the cache: section of your .gitlab-ci.yml get stored.

Setup

I had a lot of frustration setting this up, but got there in the end - the system I came up with is:

  1. Decide where your cache is going to live - mine was in /mnt/HC_Volume_1234 (my mounted drive) - I then made a cache folder inside it, although Gitlab CI prefers the path to be /cache.
    • Inside my cache folder, I made runner and docker as sub-folders to help separate the caches
  2. If it's not /cache, symlink your cache folder to be /cache in the root of your server - ln -s /path/to/folder /cache
  3. Register your runner
  4. Edit your config.toml

Adding the cache_dir

From the standard runner registration, these were the things I had to add/change

[[runners]]
  cache_dir = "/cache/runner"
  [runners.docker]
    privileged = true
    volumes = ["/var/run/docker.sock", "/cache:/cache"]
    cache_dir = "/cache/docker"

The thing that caught me out is that you need to specify "/cache:/cache" in the volumes for the Docker runner - although the Gitlab docs say you can just do "/cache", this didn't seem to work for me.

Note the two different cache_dir locations for the two different types.

Gitlab CI also expects/plays nicer if the folder is /cache in the docker runner - again I tried setting this as my mounted drive (or other folders) but it just wasn't playing ball.

With that in place - and added to as many runners as you want - they can all access the same cache from your local drive.

If your drive does start to fill up, you can nuke the runner and/or docker cache folders - it might be worth having this as a scheduled task once a week or similar.
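
A minimal sketch of such a clean-up as a cron entry, using the paths from above (adjust the age and schedule to suit):

# Delete cached files older than 7 days, every Sunday at 04:00
0 4 * * 0  find /cache/runner /cache/docker -mindepth 1 -mtime +7 -delete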

Read time: 3 mins

Tags:

]]>
2024 In Review https://www.mikestreety.co.uk/blog/2024-in-review/ Tue, 31 Dec 2024 00:00:00 GMT https://www.mikestreety.co.uk/blog/2024-in-review/ <![CDATA[

On the face of it, when reflecting back on 2024, it seemed a bit of an uneventful year. No children were born, no major world events happened and I didn't even buy a new bike.

However, when I looked back through my photos and had a proper think about what happened, it turned out to be a pretty good year. Plenty of adventures with the family, I went on a plane for the first time in a long time and I did buy a bike-related thing.

Trips out & holidays

2024 saw a couple of holidays, plenty of days out to musicals and theme parks (it helps that we're Merlin annual pass holders) and I also rediscovered going to gigs (and hope to go to a few more in 2025).

As a family we went to Peppa Pig World, Drusillas, Legoland (twice, once at Halloween and the other at Christmas), Chessington and the Sea Life Centre (Alfie got a bonus trip to Sea Life Centre twice and another to Chessington in the school holidays).

My wife and I went and saw Hamilton and Stranger Things - The First Shadow. We also took Alfie to his first show by taking him to see the Frozen musical in London.

There were only a couple of gigs I went to at the end of the year. Embrace were doing a 20th anniversary tour of their "Out of Nothing" at Chalk in Brighton. My obsession with Bastille this year was realised and I managed to get tickets to their "&" tour at the Shepherds Bush Empire, along with seeing Dan perform some songs at Resident, a small record shop in Brighton.

Holidays this year took us to Wales and to Cornwall. In Wales we visited Bluestone, which is like Center Parcs. The bonus is you get to drive golf carts to get around the site. Cornwall was a family holiday with the 4 of us. It had its ups and downs (and British holiday mentality with beach visits when it was cloudy) but ultimately was a great 7 days.

Home

At the beginning of the year we had to deal with a snail infestation in our fish tank. From one, seemingly innocent snail, we ended up with hundreds (if not thousands) of snails all over the tank. We tried all different techniques but the key was a Snailcatcher paired with an assassin snail. The snail is still around (and we think it feeds on some of the shrimp), but we haven't seen another snail since.

With the start of rest days at work (more on that in a minute) I found myself with more time at home without the family, which let me get on with some DIY bits which weren't pressing. I moved and fenced off the compost bin (so I didn't have to look at it) and replaced the rotting wooden shed with a smaller, plastic one.

After endless leaks, our fibreglass kitchen roof was replaced with an asphalt one by the original builders (who had been coming back to patch the leaks). It was no fault of theirs, the material just didn't bond well, nor did it like the English South-Coast weather.

We swapped the kids' bedrooms round this year. We bought Alfie a cabin bed which means he has more space in his room to play - something he noticed his friends having. We also took the kids to IKEA for the first time, which was an experience!

Work

Work was a roller-coaster in 2024 and ended with a bit of a squeeze. There were some highlights, however, as we introduced the 9-day fortnight alongside a health cash plan for employees. The 9-day fortnight was well received, with employees getting to select a day over a 2-week period which they can have off as a "rest day". There are some rules and boundaries around it but it seems to be respected and thoroughly enjoyed.

We had a little bit of staff turnover this year (not as much as 2023, mind) with one of our senior backend developers, Zaq, leaving. It was amicable, but he is missed. Autumn saw us hire two developers in his place - a Junior and a Mid.

I did manage to get a "work trip" squeezed in over the summer. In August I flew to Germany for my first TYPO3 Developer Days conference. It was great to meet some of the TYPO3 developer community, along with having a few days away (and getting on a plane for the first time in a long time).

Stats Analysis

2024 saw a good cadence of blog posts (I judge more than one a month to be healthy) but only one post from 2024 made it into the top 10. That post about migrating Gitlab has been in the top spot since its creation and I don't see it moving anytime soon.

Beer reviews, once again, took a little step down. This is slightly fuelled by health (drinking slightly less), partly by money (craft beers are expensive), as well as starting to find a groove of beers I like and drinking fewer new ones.

With cycling, I've finally divided out eBike rides from non-electric rides to help see where the bulk of the stats are coming from - I was surprised to see my non-eRide stats being higher than my eBike ones. The number of rides took a dip but the distance stayed steady.

I purchased a turbo trainer at the end of the year, so I am expecting my distance to be significantly higher next year - I might have to see if I can separate out the "real" miles from the Zwifting ones.

Walking took an increase as I got back into Geocaching. At the beginning of the year I managed a 17 day streak and also discovered Adventure Labs - virtual geocaches where you visit real-world places to answer questions.

A generally active year and I hope it continues.

Read time: 4 mins

Tags:

]]>
2024 Quiz of the Year https://www.mikestreety.co.uk/blog/2024-quiz-of-the-year/ Mon, 23 Dec 2024 00:00:00 GMT https://www.mikestreety.co.uk/blog/2024-quiz-of-the-year/ <![CDATA[

Every year I run a quiz for our friends on Christmas Eve and thought I would share it. If you wish to run this quiz, you will need:

  • The slides (linked below)
  • The info and notes below
  • Spotify (or other music platform)
  • Pens and paper for your teams

The quiz can be played in teams or individually - I'll leave it to you to work it out.

This year there does need to be a quiz master due to the music round, however you can omit this if you all want to play.

Slides

Get the quiz slides

The slides are on Google, however if you need them in a different format, let me know.

Quiz Information

This quiz is 5 rounds with 7 questions in each round.

When running the quiz I ask that phones are put away - more for politeness than fear of cheating. I also make it clear that the answers in the quiz are always right - even if they are not. This way it is fair and should hopefully avoid arguments.

Round explanations

1. Herd Mentality

If you have played the "Herd Mentality" game then you understand this round. It is about guessing what everyone else will answer with.

You will be asked to name something and you are trying to guess what the majority will write. The first slide is an example question to practice:

"What is the best chocolate bar?"

You write down what you think everyone else will (Double Decker, right?) and, when ready, all reveal your answers. If there is a majority of the same answer those people (or teams) score a point. If there is a tie, no-one gets a point.

If you need more assistance, read the Herd Mentality rules.

2. Music Round

This is about remembering exactly what the song titles are. You are given the artist and the song is played (feel free to play it all).

Spotify Playlist

The players must write the exact title (including punctuation) to get the point.

3. Swifties

Marketed as a "Taylor Swift" round, this round is actually about all things swift - the bird, car and even the caravan.

Multiple choice, 1 point per correct answer

4. What is this?

I got my 6 year-old to draw things from around the house - what are they?

5. 2024

7 questions about what happened in 2024.

The end

Let me know if you use this quiz and how you get on - was it too easy? Too hard? Too complicated?

Read time: 2 mins

Tags:

]]>
Slow upload speed with Apple MacOS - how to debug and what to check https://www.mikestreety.co.uk/blog/slow-upload-speed-with-apple-macos-how-to-debug-and-what-to-check/ Thu, 05 Dec 2024 00:00:00 GMT https://www.mikestreety.co.uk/blog/slow-upload-speed-with-apple-macos-how-to-debug-and-what-to-check/ <![CDATA[

Monday morning I tried to upload a screenshot on my Mac Mini and it was taking a lifetime - a quick internet speedtest and my heart sank:

  • Download (Mbps): 341
  • Upload (Mbps): 0.3

My first thought was my Internet Service Provider - I pay for 900 Mbps up and down (which I get close to with a wired connection) but my WiFi devices tend to get 300-500 Mbps. Seeing that upload speed (as a web developer) nearly brought me to tears.

Checking other devices (and phoning up the ISP), I realised that, in fact, my router was getting the full 900, with my Android phone and my wife's Windows laptop also getting the expected upload and download speeds. It seemed that both my personal laptop and work Mac Mini were struggling with uploads.

I went on a debugging frenzy - searching holes of Reddit I didn't really want to be in. The oddness of which devices were affected really stumped me.

While I was searching, I came across different fixes for every person, each of which "seemed to work". If you have stumbled across this post because you too are experiencing poor connectivity, these are the things I tried and the things that were suggested.

This is by no means an exhaustive list, nor does it tell you how to solve it, but it is worth skimming through for some pointers.

Network checking

Is it WiFi?

Plug an ethernet cable into your computer and see if it is just a WiFi issue or if it is your computer interacting with the network.

A specific access point?

Have you got a mesh network? If so, can you connect to another access point to see if it is that causing the issue?

Check the network

If you can, connect to a different network (maybe a friend's or work's) to see if it is the device or the network itself.

Review channels & channel width

Check the channels and channel width options for your WiFi networks - can they be optimised? (My Unifi router has the option to optimise these)

Are there other WiFi networks?

Do you have multiple WiFi networks (e.g. a guest or IoT one)? Can you turn off the ones you are not connected to?

Check the frequency band

Can you disable the 2.4GHz or 5GHz bands independently to see if they are interfering with one another?

Local interference?

Is there something which has recently been plugged in or moved near your device which could be interfering?

Mesh power mismatch

If you have multiple access points, are you connecting to the closest one or is another one much further away from you getting in the way?

Check other speed test servers

When running a speed test, does it still happen when you change servers?

Hardware Checking

Check other devices

Are other devices on your same network experiencing the same issue?

Change the router

If you can, switch the physical router for something else, along with any access points or switches along the path

Reboot everything

Your router, your WiFi access points, your computer

Update everything

Your router, your WiFi access points, your computer

Check cabling

Check all the cables going to and from your router

Device checking

VPN

Are you currently connected to a VPN?

Network

Are you definitely on the right WiFi?

Disconnect and reconnect

Forget the network and reconnect

DNS

Do you have any custom DNS servers set on your device or router?

MacOS Settings

Turn off "Low data mode"

System Preferences -> Network -> WiFi -> Details (next to the WiFi name) -> Low Data Mode

Turn off "Limit IP address tracking"

System Preferences -> Network -> WiFi -> Details (next to the WiFi name) -> Limit IP address tracking

Turn off "Private WiFi address"

System Preferences -> Network -> WiFi -> Details (next to the WiFi name) -> Private WiFi address

Lower your MTU (Spoiler, this is what did it for me)

System Preferences -> Network -> WiFi -> Details (next to the WiFi name) -> Hardware

  • Configure: Manually
  • MTU: Custom

For me, 1436 was the magic number; going any higher than this and the upload speed dropped again.

What the MTU is, I don't really understand, however there were a few blog posts that helped me work out what my MTU should be.
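
A common technique for this (a sketch, not taken from those posts) is to ping with the "don't fragment" flag and shrink the payload until it goes through - the largest payload that works, plus 28 bytes of headers, gives your MTU:

# macOS: -D sets the "don't fragment" bit, -s sets the payload size in bytes
ping -D -s 1472 -c 3 1.1.1.1   # 1472 + 28 = 1500; fails if the path MTU is lower
ping -D -s 1408 -c 3 1.1.1.1   # 1408 + 28 = 1436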

Read time: 3 mins

Tags:

]]>
thisisunsafe - how to bypass Chrome's ERR_CERT_INVALID warning https://www.mikestreety.co.uk/blog/thisisunsafe-how-to-bypass-chromes-err-cert-invalid-warning/ Thu, 14 Nov 2024 00:00:00 GMT https://www.mikestreety.co.uk/blog/thisisunsafe-how-to-bypass-chromes-err-cert-invalid-warning/ <![CDATA[

TL:DR; If presented with a NET::ERR_CERT_INVALID Chrome error, focus the Chrome window and type the letters thisisunsafe - the window should refresh with the website.

During a website prelaunch, you may wish to preview the new website on an existing domain. To do this, you can update your host file, flush the DNS cache and open your browser.

Side-note: If you are on a Mac you can do this with:

  • Host file is here: /etc/hosts
  • Flush the computer's DNS cache with: sudo killall -HUP mDNSResponder
  • You may need to go to chrome://net-internals/#dns to flush Chrome's DNS
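
For example, a hosts entry pointing the live domain at the new server might look like this (the IP and domain are placeholders):

203.0.113.10    www.example.com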

However, if the new server is using a self-signed SSL certificate then you are often faced with something that looks like this:

Screenshot of Chrome showing a ERR_CERT_INVALID error

Self-signed certificates often crop up when Let's Encrypt is the SSL provider. The most common way for Let's Encrypt to issue a certificate requires it to reach the website on the "live" domain name - which, if you are previewing a new environment, it won't be able to do. In that case, a self-signed certificate is often used instead.

Chrome, by default, prevents you from accessing sites with a self-signed or invalid SSL (blah blah, security) and, instead, displays the page shown above with no obvious way to bypass.

There is a way around this, however, by typing thisisunsafe.

It goes without saying (despite me now saying it), you should only do this if you trust the website and server. Using a Chrome extension like Website IP allows you to see which IP you are visiting to ensure it is trustworthy.

To use the "thisisunsafe" workaround

  1. Click the Chrome window to ensure it is active
  2. Type the letters thisisunsafe and wait
  3. The page should refresh with your website in view

Reference link: Chrome: Bypass NET::ERR_CERT_INVALID for development

Read time: 2 mins

Tags:

]]>
Bulk create populated Google Docs from a Google Spreadsheet https://www.mikestreety.co.uk/blog/bulk-create-populated-google-docs-from-a-google-spreadsheet/ Fri, 04 Oct 2024 00:00:00 GMT https://www.mikestreety.co.uk/blog/bulk-create-populated-google-docs-from-a-google-spreadsheet/ <![CDATA[
The original code & concept was adapted from Phil Bainbridge and the code provided in his Bulk create Google Docs from Google Sheet data blog post.

There is often a time, during a website content creation phase, when people have the time and resources to spend writing and adapting content, but the new website is not yet set up. During this phase, we opt for writing content in Google Docs, as this prevents anyone being blocked - the clients can continue with content while we configure the CMS. It also means there is content readily available for designers and developers alike.

Using the method below, we create documents for each page of the website. These are generated from a Google Sheet (which is usually generated from a website sitemap/scraping tool).

The script has the ability to "mail merge". Any column titles surrounded by << and >> will be replaced with the cell contents. It also has the ability to retroactively update variables/placeholders.

Notes

Some noteworthy features and/or differences to the original

  • Documents will be created in the same folder as the spreadsheet
  • If a document already exists with the same name, it will use that file for any variable updates
  • If the folder contains a document called "Template", it will copy that as a basis for all documents, otherwise it will make an empty document
  • If the document (or template) contains "variables", these will be replaced (see the short example after this list)
    • Variables are a sluggified version of the column title surrounded by << >> (e.g. Description will be <<description>>)
    • To check what your column name will be, you can use the following: https://slugify.online/
  • If you have a sheet/page called "Log" in your spreadsheet, a log of events will be output so you can track its progress
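
As an illustration (the column names here are hypothetical), a sheet with Title, Link and Meta Description columns would give you <<title>>, <<link>> and <<meta-description>> as variables, so a Template document could contain:

Page: <<title>>
Meta description: <<meta-description>>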

Setup

  1. Create a new folder in Google Drive
  2. Create a new spreadsheet in that folder
  3. Populate the first sheet with your document titles and any other columns (although not necessary, I would advise having at least a Title and Link column)
    • The script will use a Title column as the document name if it exists, otherwise it will use "Row: X" where X is the row number
    • If you want a link to the doc to be generated, add a Link column
  4. Go to Extensions -> App Scripts and paste the below code - click Save
  5. Go back to your spreadsheet and refresh, there should be a Scripts menu item with Create Docs from this Spreadsheet option - click that
    • The process can take a while - about 10 seconds per page. So set it running and go and grab a coffee.

The code

/*
 * This overall script is designed to bulk create Google Docs from data within a Google Sheet.
 */

/**
 * The main script
 */
function createDocsFromSpreadsheet()
{
	// Log starting of the script
	logEvent('Script has started');

	const spreadsheet = getCurrentSpreadsheet(),
		// Get current folder
		folder = DriveApp.getFileById(spreadsheet.getId()).getParents().next(),
		// Get Data sheet (first sheet)
		dataSheet = spreadsheet.getSheets()[0];

	let files,
		template;

	// Assign via destructuring
	[files, template] = getOtherFilesFromFolder(folder, spreadsheet);

	// Fire the create function
	createDocuments(dataSheet, folder, files, template);

	logEvent('Script has ended');

}

/**
 * Get the currently active spreadsheet
 */
function getCurrentSpreadsheet()
{
	var spreadsheet = SpreadsheetApp;
	return spreadsheet.getActiveSpreadsheet();
}

/**
 * Find all the files (except itself)
 */
function getOtherFilesFromFolder(folder, spreadsheet)
{
	// Set up variables
	let list = [],
		template = false,
		files = folder.getFiles();

	// Loop through the variables
	while (files.hasNext()){
		const file = files.next();

		// Exclude ourselves
		if(file.getId() === spreadsheet.getId()) {
			continue;
		}

		// Create a object with data
		let f = {
			name: file.getName(),
			slug: slugify(file.getName()),
			id: file.getId()
		};

		// Exclude the template
		if(f.slug === 'template') {
			template = f;
			continue;
		}

		// Keep the rest
		list.push(f);
	}

	return [list, template];
}

/**
 * Create the documents
 */
function createDocuments(dataSheet, folder, existingFiles, template) {

	// Log starting createDocs Function
	logEvent('Starting createDocuments Function');

	// Get the formatted spreadsheet data
	let headers,
		data;
	[headers, data] = formatRows(dataSheet.getDataRange().getValues())

	if(!data.length) {
		return;
	}

	for(let page of data) {
		// Create a file name
		let fileName = page.title ? page.title : 'Row: ' + page.row;

		// Find or create a new file (maybe from the template)
		logEvent('Looking for: ' + fileName);
		let file = getOrMakeFile(fileName, existingFiles, template, folder)

		if(!file) {
			continue;
		}

		// Populate any variables - even if it's an existing sheet
		let fileId = file.getId();
		populateTemplateVariables(fileId, page);

		// Get the column with a title of "Link"
		let linkColumn = (headers.map(a => a.slug)).indexOf('link');
		if(linkColumn >= 0) {
			// If it exists, add the URL
			dataSheet.getRange(page.row, (linkColumn + 1)).setFormula('=HYPERLINK("' + file.getUrl() + '","' + fileName + '")');
		}

		// Refresh the spreadsheet so links appear as soon as they are added
		SpreadsheetApp.flush();
	}
}

/**
 * Find an existing file or make a new one
 *
 * If a "Template" file exists, use that
 */
function getOrMakeFile(fileName, existingFiles, template, folder)
{
	let file = false;

	let matchingFileList = existingFiles.filter(f => f.name === fileName),
		existingFile = matchingFileList.length ? matchingFileList[0] : false;

	if(existingFile) {
		logEvent('Already exists: ' + fileName);
		file = DriveApp.getFileById(existingFile.id)
	} else if(template && template.id) {
		logEvent('Creating from template: ' + fileName);
		try {
			file = DriveApp.getFileById(template.id).makeCopy(fileName, folder);
		}
		catch(e) {
			// if failed set variable as false and Log
			logEvent('Failed to copy the template: ' + e);
		}

	} else {
		logEvent('Creating empty file: ' + fileName);
		try {
			file = DocumentApp.create(fileName)
			file = DriveApp.getFileById(file.getId())
			file.moveTo(folder);
		}
		catch(e) {
			// if failed set variable as false and Log
			logEvent('Failed to make a new file: ' + e);
		}
	}

	return file;
}

function populateTemplateVariables(fileId, page) {

	let fileContents = false;

	try {
		fileContents = DocumentApp.openById(fileId).getBody();
	}
	catch(e) {
		// if failed set variable as false and Log
		logEvent('Failed to open file: ' + e);
	}

	if(!fileContents) {
		return;
	}

	for(let key in page) {
		fileContents.replaceText('<<' + key + '>>', page[key]);
	}
}

function formatRows(rows)
{
	let headers = [];

	for(let h of rows.shift()) {
		headers.push({
			title: h,
			slug: slugify(h)
		})
	}

	let data = [];

	// Start at 2 so it matches with the rows in the sheet
	let rowCount = 2;

	for(let row of rows) {
		let d = {
			row: rowCount
		};

		for (var col = 0; col < row.length; col++) {
			d[headers[col].slug] = row[col];
		}

		data.push(d)
		rowCount++;
	}

	return [headers, data];
}

/**
 * Add the menu Item
 */
function onOpen() {
	SpreadsheetApp.getUi()
		.createMenu('Scripts')
		.addItem('Create Docs from this Spreadsheet', 'createDocsFromSpreadsheet')
		.addToUi();
}

/**
 * Log event (if there is a sheet called Log)
 */
function logEvent(action) {
	// Use the scripts logger
	Logger.log(action);

	// get the user running the script
	var theUser = Session.getActiveUser().getEmail();

	// get the relevant spreadsheet to output log details
	var ss = SpreadsheetApp.getActiveSpreadsheet();
	var logSheet = ss.getSheetByName('Log');

	if(!logSheet) {
		return;
	}

	// create and format a timestamp
	var dateTime = new Date();
	var timeZone = ss.getSpreadsheetTimeZone();
	var niceDateTime = Utilities.formatDate(dateTime, timeZone, "dd/MM/yy @ HH:mm:ss");

	// create array of data for pasting into log sheet
	var logData = [niceDateTime, theUser, action];

	// append details into next row of log sheet
	logSheet.appendRow(logData);

}

/**
 * Convert a string into a slug
 */
function slugify(str) {
	str = str.replace(/^\s+|\s+$/g, ''); // trim leading/trailing white space
	str = str.toLowerCase(); // convert string to lowercase
	str = str.replace(/[^a-z0-9 -]/g, '') // remove any non-alphanumeric characters
		.replace(/\s+/g, '-') // replace spaces with hyphens
		.replace(/-+/g, '-'); // remove consecutive hyphens
	return str;
}

Read time: 11 mins

Tags:

]]>
Fixing a failing Docker layer https://www.mikestreety.co.uk/blog/fixing-a-failing-docker-layer/ Mon, 23 Sep 2024 00:00:00 GMT https://www.mikestreety.co.uk/blog/fixing-a-failing-docker-layer/ <![CDATA[

TL:DR;

Problem: I was getting a failing layer push to a Docker registry, even when the image was built without a cache.

Solution: It ended up being a layer which was too big and timing out - I identified the problem layer with Dive and split out my RUN instructions.


Too Long Did Read

Recently, on a project which deploys via Docker, I started getting a Docker layer which wouldn't push. Nothing significant had changed with the Dockerfile, image or the base files in the repository.

It manifested itself in the logs (both locally and via Gitlab CI) with a forever looping retry:

ec7ca1533d1c: Retrying in 5 seconds
ec7ca1533d1c: Retrying in 4 seconds
ec7ca1533d1c: Retrying in 3 seconds
ec7ca1533d1c: Retrying in 2 seconds
ec7ca1533d1c: Retrying in 1 second
ec7ca1533d1c: Retrying in 10 seconds
ec7ca1533d1c: Retrying in 9 seconds
ec7ca1533d1c: Retrying in 8 seconds
ec7ca1533d1c: Retrying in 7 seconds
ec7ca1533d1c: Retrying in 6 seconds
ec7ca1533d1c: Retrying in 5 seconds
ec7ca1533d1c: Retrying in 4 seconds
ec7ca1533d1c: Retrying in 3 seconds
ec7ca1533d1c: Retrying in 2 seconds
ec7ca1533d1c: Retrying in 1 second

Each time it got down to 1 second and failed, the starting number would increase. Eventually, the whole process failed.

I originally thought it would be a caching issue, so I cleared CI caches, removed remote containers & built my image with the --no-cache option on the CLI. However, none of this seemed to make a difference.

Speaking to a colleague, he mentioned it was often layer size which timed out and prevented pushing. Upon doing some research, I found a tool called Dive, which allows you to inspect each layer (and the filesystem differences) of a Docker image: see Dive on GitHub.
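
For reference, inspecting an image with Dive is a single command (the image name here is a placeholder):

dive registry.example.com/my-image:latest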

It took me a while to identify the problem layer, as the order they appeared in the logs was not necessarily the order they were built. I also hadn't clocked that the first 12 characters in the logs were the first 12 characters of the sha256 hash of each layer. It seems obvious now, but when you're in the rage haze, it was easy to overlook.

Dive lists out the "Digest" (sha256) of each layer and I found that if you put the first 12 characters into the terminal search, they are immediately highlighted when you get to the problem layer.

I identified the problem layer as being the one where I update and install dependencies in our base Docker image:

### Install required packages
RUN DEBIAN_FRONTEND=noninteractive apt-get update -y && \
	DEBIAN_FRONTEND=noninteractive apt-get upgrade -y && \
	DEBIAN_FRONTEND=noninteractive apt-get install -y \
	apache2 \
	apg \
	brotli \
	bzip2 \
	catdoc \
	cron \
	curl \
	default-mysql-client \
	exim4 \
	gawk \
	gifsicle \
	git \
	htop \
	imagemagick \
	jpegoptim \
	less \
	locales \
	nfs-common \
	ntp \
	php7.4-common \
	php7.4-curl \
	php7.4-fpm \
	php7.4-gd \
	php7.4-intl \
	php7.4-json \
	php7.4-mbstring \
	php7.4-mysql \
	php7.4-simplexml \
	php7.4-xml \
	php7.4-zip \
	pngcrush \
	poppler-utils \
	rsync \
	snmp \
	sudo \
	supervisor \
	sysstat \
	vim \
	webp \
	wget \
	zopfli

My thinking behind combining the update & installs into one instruction, at the time, was to reduce the number of layers Docker created. I didn't consider the size of each layer ever being an issue.

With some trial and error of splitting up commands, I eventually landed on something like the following. I didn't identify exactly which package was causing the issue (it was late at night), but splitting it up into 4 sections seemed to create small enough layers that they could be pushed.

It's worth noting we are looking to deprecate this Docker image due to performance, so I didn't want to sink too much time into something which will be replaced soon.

RUN DEBIAN_FRONTEND=noninteractive apt-get update -y
RUN DEBIAN_FRONTEND=noninteractive apt-get upgrade -y
RUN DEBIAN_FRONTEND=noninteractive apt-get install -y \
	apache2 \
	apg \
	brotli \
	bzip2 \
	catdoc \
	cron \
	curl \
	default-mysql-client
RUN DEBIAN_FRONTEND=noninteractive apt-get install -y \
	exim4 \
	gawk \
	gifsicle \
	git \
	htop \
	imagemagick \
	jpegoptim \
	less \
	locales \
	nfs-common \
	ntp
RUN DEBIAN_FRONTEND=noninteractive apt-get install -y \
	php7.4-common \
	php7.4-curl \
	php7.4-fpm \
	php7.4-gd \
	php7.4-intl \
	php7.4-json \
	php7.4-mbstring \
	php7.4-mysql \
	php7.4-simplexml \
	php7.4-xml \
	php7.4-zip
RUN DEBIAN_FRONTEND=noninteractive apt-get install -y \
	pngcrush \
	poppler-utils \
	rsync \
	snmp \
	sudo \
	supervisor \
	sysstat \
	vim \
	webp \
	wget \
	zopfli

The main reason for this post is to (hopefully) help someone. I spent a few hours hunting round the internet for similar issues and no-one mentioned it could be the size of your Docker layers.

Read time: 3 mins

Tags:

]]>
A summary of TYPO3 Developer Days 2024 https://www.mikestreety.co.uk/blog/a-summary-of-typo3-developer-days-2024/ Mon, 12 Aug 2024 00:00:00 GMT https://www.mikestreety.co.uk/blog/a-summary-of-typo3-developer-days-2024/ <![CDATA[

Just over a week ago, I returned from Germany after attending the multi-day, multi-tracked TYPO3 conference: TYPO3 Developer Days. This conference marked several firsts for me: my first TYPO3-specific conference, my first visit to Germany, my first time traveling alone, and my first time drinking Pilsner for four consecutive evenings (for those who don't know, I review beer in my spare time).

Side note: I found the unfiltered Pilsner had a much more complex and tasty flavor compared to the straight filtered beer.

Getting there

The conference took place in Karlsruhe, in the south-west of Germany. Although this event brought me to Germany, I didn't see much of the country. My journey consisted of a 6:15am 3-hour National Express ride to Heathrow airport, a wait, a flight and another wait followed by a taxi and train to reach the hotel. I did, however, manage to briefly explore the woodland opposite the hotel, searching for (and finding) a couple of Geocaches after breakfast one morning: Oberwaldtradi - OWT #02 🌲🌳 and Oberwaldtradi - OWT #04 🌲 🌳.

We stayed at the Genohotel, which offered great spaces for socializing and drinking the aforementioned Pilsner, along with a lovely outdoor area that prevented feeling cooped up inside. The food was amazing; I enjoyed every meal and experienced plenty of new dishes, though I couldn't tell you what most of them were called as all the labels were in German.

The conference

The conference itself was nothing short of inspirational. Most talks, as expected, focused on TYPO3 itself, its capabilities, and upcoming features. While you do come away feeling like you've been "drinking the Kool-aid" a little, even after the honeymoon phase, I still find myself excited about many cool features coming in future releases.

Two TYPO3-centric talks that particularly excited me were Benjamin Franzke's presentation about upcoming Site Sets and Simon Praetorius' talk on the Vite extension and plugin.

With Site Sets, I've already encountered a couple of scenarios in the last week where they would have been beneficial, mainly around multi-site setups and sharing configuration and plugins.

Vite is something we've been considering at Liquid Light, as we've been using Gulp for over 10 years (Advanced Gulp File is a blog post I wrote when we'd just migrated). Despite Gulp 5 recently being released, we've been looking to optimize our front-end builds. The TYPO3 extension and accompanying NPM package Simon presented are essential for making this switch.

As for non-TYPO3 focused talks, Zack Lott gave a great overview of security scanners and bravely conducted a flawless live demonstration. It provided much food for thought, and I've added Semgrep and Trivy to my to-do list.

Christian Heilmann delivered a fantastic, thought-provoking talk on making the web simpler and more accessible.

This isn't to belittle the other talks, of course - every single one I attended had great takeaways. I furiously scribbled notes during each talk, which I wrote up at the end of each day. They might not mean much to anyone else, but they serve as a nudge to remind me what I learned. If you're interested, you can find them here:

The people

The talks are only half of what makes a conference great. The people truly make conferences special. Everyone I met was kind and welcoming, and I hugely appreciate the effort everyone made to speak in English so I could join in the conversations. I was one of six who had travelled from the UK (a team from Prater Raines, TYPO3 Tom, and Zack), and between us, we couldn't speak much German beyond the basics. I was bowled over by how perfectly everyone spoke English.

I got to meet many incredible people, and it was great that the speakers stayed and mingled so we could pick their brains over a few evenings. There were nearly as many (undocumented) takeaways and learnings from the evenings as there were during the day.

All in all, it was a fantastic, educational, and welcoming atmosphere. As long as time and circumstances allow, I will certainly be back for future TYPO3 Developer Days.

Other write-ups

Read time: 3 mins

Tags:

]]>
TYPO3 Developer Days 2024: Day 3 - 3rd August 2024 https://www.mikestreety.co.uk/blog/typo3-developer-days-2024-day-3-3rd-august-2024/ Sat, 03 Aug 2024 00:00:00 GMT https://www.mikestreety.co.uk/blog/typo3-developer-days-2024-day-3-3rd-august-2024/ <![CDATA[

TYPO3 Developer Days Day 3 notes.

See other days:

Managing and Developing an Extranet as a TPA - A TYPO3 Page App - Rebecca Düker, Christian Keuerleber

T3DD Schedule Link / Slides / Video (tbc)

  • Built a new extranet
  • Started with MVP
    • Define important functions
    • Understood complexity
  • Routing (routes = paths)
    • Allow bookmarking
    • Allow logging in & redirecting back to original path
    • TYPO3 & API routes sent to TYPO3, everything else to React Router
  • API uses b13/slimphp-bridge
    • use Slimphp for a frontend typo3 request
  • React components for frontend
  • Use DTO (Data Transfer Objects) to communicate between the two
    • Variables are defined and typed early to allow FE and BE to work independently
  • Tips
    • Send the data you need to the FE
    • Only send UIDs back to BE
    • Use HTTP/REST standards
  • Plans can change (as you understand the brief more)
    • Creativity increases as the project progresses
  • Communication & transparency is key
    • Trust as basis
    • Grants access to full potential
  • Access is done in TYPO3 so even the Middleware is blocked from unauthorised requests
  • Imports are CLI tasks so they can be scheduled
  • They set up a middleware application between external resources & TYPO3 which prepares the data
  • Tests are run with Codeception
    • API calls with Guzzle
    • Tests CLI commands
    • FE tests boot up the whole app
  • Monitoring is done using Playwright on Production
    • Runs nightly
  • Composer audit & NPM audits are run on deployment
  • Sanity is used for error logging
  • In NGINX they log the page rendering time which can be used for performance checks

Vite – TYPO3’s nimble frontend companion - Simon Praetorius

T3DD Schedule Link / Slides / Video (tbc)

  • Why bundles?
    • Combine, optimise & bundle into single files
  • Which bundler?
    • Results (compatibility, consistency, optimisations)
    • Developer experience (setup, speed, extensibility)
    • Maintenance (maintainers, community, release cycle)
  • Vite uses esbuild & rollup under the hood
  • Vite has
    • SCSS & PostCSS built in (and more)
    • Bundles references assets (e.g. fonts & images)
    • Built in TypeScript
    • Automatic code splitting
    • Output as ES Modules
  • Hot module replacement
  • Robust file watching & cache busting
  • Lots of plugins, but if there isn't a Vite one you can use Rollup plugins
  • TYPO3 Vite plugin (npm install --save-dev vite vite-plugin-typo3)
    • Configures Vite (uses composer.json to find extensions)
    • Uses Configuration/ViteEntrypoints.json for paths
  • Vite TYPO3 Extension (composer req praetorius/vite-asset-collector)
    • Production/Dev context switching
    • ViewHelpers to embed assets
  • manifest.json file generated by Vite which TYPO3 consumes
  • Get started
    • Install extension
    • Install NPM plugin
    • Configure Vite
    • Setup entry points
    • Use ViewHelper
    • Start Vite server
  • Best practices
    • Import all JS & CSS for a plugin together
    • Use glob to find all the files (set {eager: true})
    • npm add -D sass
    • Each extension gets an alias based on TYPO3 extension name (e.g. @site_package)

Responsive Images - Helmut Hummel

T3DD Schedule Link / Slides (tbc) / Video (tbc)

  • Why bother?
    • Performance
    • Sustainability
    • SEO
  • Features of browsers
    • Preload scanner (scans for resources before DOM is parsed)
    • Tags and attributes
      • srcset - multiple image sizes and resolutions, browser picks. Screen ration & device pixel ratio is taken into account
      • If you start small and go bigger, it will load the next bigger image
      • If you start big and go small, it will just keep the big image
      • srcset has sizes attribute - tells the browser how much space the image will use
    • picture and source tags - for when you need art direction
      • use srcset inside source tag
      • picture and source can't be styled, you style the img tag instead
  • TYPO3 extension - Top Image
    • Declarative configuration (single source of truth)
    • Close to web platform specification
    • ViewHelper for rendering
    • Includes an API
    • Configuration is done by PHP
    • Has debug mode to put images with text information instead of your original image
    • Repo has an example extension within the tests repo
  • In Chrome dev tools, if you hover over a srcset it will tell you the current source

Let's make a simpler, more accessible web - Christian Heilmann

T3DD Schedule Link / Slides / Video (tbc)

  • We've made the web for developers, not users
  • Why are we looking at spinners and loads and not content?
  • The web is built on resilient tech (HTML CSS)
    • Things that are wrong get discarded
  • But we've chosen the brittle one (JS)
  • Why?
    • Lack of control feels weird
    • Impatience - the web tech evolution felt slow
    • Perceived complexity
    • IE / Safari
    • Market push
  • 20 years ago, Chris wrote a blog post about "Unobtrusive Javascript"
  • The web used to be HTML & CSS and is now super complicated
  • Facts
    • Browsers are constantly updated
    • Web standards process is faster
    • We don't all need to build killer apps
    • Our goal should be satisfied users
    • Browsers are good at optimising UX
      • They can't break the web as they are the portal to the web
    • Browsers are just marketing tools to get you using the company's other tools
  • The web should offer user preferences (dark mode, high contrast)
  • Optimise what you can control
  • You cannot break HTML - send loads of it
  • Use the right formats (WebP, Avif)
  • Use closer servers
  • Don't put things in your machine that you don't understand
  • Caching - Offline first
  • Remove old libraries and polyfills
  • Don't bundle things that are only used once
  • Think about what you are including and why
  • Don't put images in CSS - put them in HTML where they belong so they can be loaded correctly
  • You can do object-fit with video
  • Put lazy loading on images & defer on script tags
  • When searching on a page, <details> will auto expand if the text is inside
  • CSS property color-scheme: light dark; (MDN link)
  • Stop blaming browsers
  • Developer tools in webkit allow you to replicate blurred vision & colour blindness

QA in TYPO3 - CI in a community-driven open-source project - Markus Klein, Christian Kuhn

T3DD Schedule Link / Slides (tbc) / Video (tbc)

  • Why does QA matter?
    • CI is key for open source - if nothing is automated, your code won't get to a viable state
    • Learning
  • TYPO3 has over 670k lines of code
  • Aspects of CI
    • Lots of different platforms
    • Developers all over the world, but no contracts, so no "higher power"
    • There are guidelines, they can be violated, but the system needs to monitor these and catch these
  • Challenges
    • Maintain stability
    • Enhance code quality
  • Strategy
    • Deliver feedback as quick as possible
    • Easy reproducibility to local dev env - build an infrastructure where you can run it locally
    • Be deprecation free
    • Have each patch fully tested before merging
    • Run nightly tests which have more test permutations & edge cases (take longer to run)
    • If nightly test fails, the fix is prioritised
  • What is tested
    • Unit,
    • Functional and Integration which use a real database
    • End-to-end uses headless browser
    • Static code scanning
    • Integrity checks (commit checks, exception code checks, composer integrity)
  • Current process
    • Gerrit give you a linear history
    • Gerrit makes a new branch in Gitlab which runs the pipeline
    • CI load split over multiple runners
    • TYPO3 maintain their own images for running tests
  • Security team creates patches privately and merges on release day - pipeline needs to be able to run CI and deploy security patches as quickly as possible
  • Test often and early
  • Converting to Podman improved CI performance
  • CI provides a safeguard for new developers and the code

Other Talk Resources

Read time: 5 mins

Tags:

]]>
TYPO3 Developer Days 2024: Day 2 - 2nd August 2024 https://www.mikestreety.co.uk/blog/typo3-developer-days-2024-day-2-2nd-august-2024/ Fri, 02 Aug 2024 00:00:00 GMT https://www.mikestreety.co.uk/blog/typo3-developer-days-2024-day-2-2nd-august-2024/ <![CDATA[

My notes, links and useful points from second day of TYPO3 Developer Days.

See other days:

Our quest for ACL improvements in TYPO3 Core - Tomasz Woldański

T3DD Schedule Link / Slides (tbc) / Video (tbc)

  • No change for TYPO3 user permissions over the last few years
  • Lots of little things can make a big difference
  • Survey results
    • Missing best practices
    • Complex UI/UX
    • Deployable permissions
  • Best practices
    • Avoid setting permissions on a user
    • Have a login for every user (no sharing)
    • Create a different BE User group for each category/role
      • System: User groups for file & database mounts
      • ACL: Content & page permissions
      • Role: No permissions per se, but inherits from all the others
  • Documentation was updated
  • Addition of creating BE user group on initial site setup (v13.1)
  • Addition of CLI commands to create predefined user groups
  • UX Improvements for ACL
    • Split record & modal permissions into different tabs
    • Combined Access with User permissions
    • Searchable fields in the exclude fields permissions
    • Combined read & write (view & modify) permissions into a nicer table
    • Can add & edit users when editing the group
  • Extension presets - predefine different roles & permissions in your extension to be loaded
  • Deployable permissions most likely to be v14

Innovating Integration: A Case Study on B2B with TYPO3 Headless - Łukasz Uznański

T3DD Schedule Link / Slides (tbc) / Video (tbc)

Because this was a case study, it was demonstrating what was achieved so there weren't too many notes.

  • Improved their sales process - used TYPO3 as a content hub
  • Vue.js frontend and using TYPO3 headless
  • Choose the right tool for the job
  • Vue front end pulls in different data from different services
  • Nuxt - authenticates with TYPO3 which then connects to Magento
  • Only the Magento UID & user group are stored - this allows restricting of content & showing different promotions to different groups - it also helps with GDPR

Language Overlay - How it works - Benni Mack

T3DD Schedule Link / Slides (tbc) / Video (tbc)

  • Overlays are used for languages & workspaces
  • Context API has language and workspace state/aspect
  • Language - started as TypoScript but is now a single source of truth in the site config which can be used in FE and BE. This contains all the language config
  • l10n_parent field is used everywhere for localisation config, except tt_content which uses l18n_parent
  • When loading a different language, the default lang record is loaded and then the translation - all the fields are then replaced except the uid. The UID of the translated page is put into _OVERLAY_UID
  • If no translation is found, the fallback chain is referred to
  • Overlay replaces all the data except the UID because:
    • All links point to the default language
    • PID is always the default language
  • Fallbacks are essentially overlays but in the other direction
  • Workspaces
    • t3ver_wsid - What workspace is this?
    • t3ver_oid - The ID of the original/online/live page
    • t3ver_state - Is this a change/deletion/addition
  • When viewing the FE of a workspace, every content & page record is checked for a workspace overlay
    • If a version is found, every field is replaced except PID and UID
  • When loading a language in a workspace, several hops are made:
    1. Live, Default lang
    2. Versioned (workspaced), Default lang
    3. Live, translated version
    4. Versioned, translated version
  • Why is it complicated?
    • Historical reasons
    • No better solution
    • TYPO3 are trying to make it less complicated
    • Keeps the sorting & position across translations
    • In an ideal world, overlays and sys_language_uid wouldn't be needed
    • Saves space (instead of duplicating the DB, only needed records are made)
      • But does mean more queries
  • Just use the APIs
  • PageRepository uses the Context API (see the sketch after this list)
    • Access Workspace & Language overlays in FE
  • cObj uses PageRepository
  • PageRepository has plenty of PSR-14 events to use
  • BackendUtility for getting Workspace and Language overlays in BE
  • You can use PageRepository in the BE
  • RelationHandler to read & write related DB records
  • Use DataHandler for writing
  • You always need a default language, but can use the "Hide default language of a page" checkbox in page properties
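
To make "just use the APIs" concrete for future me, here is a minimal sketch of reading the current language & workspace aspects from the Context API and letting PageRepository apply the overlay. It assumes a v12-ish core and an example page UID of 42 - double-check the method names against the version you are on.

<?php
use TYPO3\CMS\Core\Context\Context;
use TYPO3\CMS\Core\Domain\Repository\PageRepository;
use TYPO3\CMS\Core\Utility\GeneralUtility;

// Which language & workspace is the current request in?
$context = GeneralUtility::makeInstance(Context::class);
$languageId = $context->getPropertyFromAspect('language', 'id');
$workspaceId = $context->getPropertyFromAspect('workspace', 'id');

// PageRepository shares the same Context, so language & workspace overlays
// are applied for you instead of querying translated records by hand.
$pageRepository = GeneralUtility::makeInstance(PageRepository::class, $context);
$page = $pageRepository->getPage(42); // 42 = example page UID (assumption)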

The SAST and the furious - Zack Lott

T3DD Schedule Link / Slides / Video (tbc)

  • Application security
    • Testing security features
    • Prevents users from doing unauthorised actions
  • OWASP Top 10
  • Normal methods for finding security issues
    • Code reviews
    • Previous experience
    • Own Tools
    • Security tools
  • SAST Scanners
    • Analyse your code, like PHPStan
  • Can run during development locally or on CI
  • Doesn't require infrastructure like database
  • Lean on the experts to find issues with predefined rules
  • Semgrep
    • Open source
    • PHP & JS
    • Custom rules
    • Integrate pipelines & run locally
    • Has a library of rules
    • brew install semgrep
  • Supply chain attacks
    • Targets third party vendors (e.g. Crowdstrike)
  • Do you know your dependencies & sub dependencies and if they have CVEs?
  • Trivy
    • Scans NPM, Composer, APT & APK and OS
    • Will identify common CVEs
    • Local scanning on computer or server
    • Scan against a repo
    • Scan docker images
    • brew install trivy
  • Gitlab requires setting up with YAML; with Github you can just "add" it
  • Semgrep and Trivy export JSON, SARIF
  • SBOMs list all your dependencies and are sometimes requested

TYPO3-Rector v2 - Henrik Elsner

T3DD Schedule Link / Slides (tbc) / Video (tbc)

  • Upgrades are long and expensive
  • Any problem that arises after an upgrade is always a problem of the upgrade
  • Rector
    • Migrate TYPO3 TCA
    • Classes/Extbase/Icons PHP
    • TypoScript/YAML/Fluid
    • Will also tell you what it can't do automatically
  • Benefits
    • More time for testing
    • Learn changes you didn't know
    • More efficient
    • Keeps knowledge which gets lost
  • TYPO3 Rector is a wrapper around Rector (a minimal rector.php sketch follows this list)
  • Trust Rector
    • Each rule has tests
    • Dry run on the first run
    • Rector detects the class and ensures the methods are OK (unlike the extension scanner in TYPO3)
    • Treat Rector like a junior employee - not a senior
  • Best practice
    • Clean up your files first (delete unused code)
    • Run Rector for the current version you are on
      • E.g. if you are doing 11 -> 12, run it for 11 first to ensure you are up-to-date
    • Run the latest Rector first (v2), then run v1 to catch old rules - then run v2 again in case any rule got updated
  • Tips
    • For TCA, it needs the ctrl and columns array keys
    • For selects it needs 'type' => 'select' - even in a TCA override
  • Fractor for files
  • Consider running Rector in CI to prevent old code from being copied/used
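
As a nudge for future me, this is a minimal rector.php sketch for a TYPO3 upgrade run. The paths and the Typo3LevelSetList class/constant are my assumptions from the typo3-rector documentation and can differ between releases, so verify them against the version you install.

<?php
// rector.php - minimal sketch; do a `vendor/bin/rector process --dry-run` first
declare(strict_types=1);

use Rector\Config\RectorConfig;
use Ssch\TYPO3Rector\Set\Typo3LevelSetList;

return RectorConfig::configure()
    ->withPaths([
        __DIR__ . '/packages', // local extensions - adjust to your project (assumption)
    ])
    ->withSets([
        // Run the level for the version you are currently on first,
        // then re-run with the target version's level
        Typo3LevelSetList::UP_TO_TYPO3_12,
    ]);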

Securing TYPO3 Web Applications - Oliver Hader

T3DD Schedule Link / Slides (tbc) / Video (tbc)

  • XSS
    • Allows injection of JS
    • Can lead to remote controlling (e.g. key logger or crypto miner)
    • Different types
    • Protect against SVG uploads
  • GET Param
    • htmlspecialchars
    • json_encode
  • TYPO3
    • <f:format.raw> and <f:format.htmlentitiesDecode> do not sanitise
    • Use <f:format.html> or <f:sanitize.html> instead
  • Be aware of securing your JS files (outside of TYPO3)
  • Encode HTML and JSON
  • Use HTML sanitiser (lib.parseFunc)
  • Use SVG sanitiser for uploaded files
  • Apply a Content Security Policy
  • Introduce Trusted Types in your JS
  • SQL Injection
    • Allows injection of SQL
    • Could lead to leaking of sensitive data
  • sqlmap - Runs common SQL injection commands
  • Create named parameters when interacting with the DB
  • Use prepared statements (a combined sketch of the encoding & DB mitigations follows this list)
  • Insecure direct object reference (IDOR)
    • Manipulate/retrieve internal resources by knowing identifiers
    • E.g. UIDs, filenames etc
  • Ensure different values can't be used (e.g. changing an ID in an "update" form)
  • Cross-site Request Forgery
    • Tricked into visiting a malicious website
    • Use strict cookies where possible
    • lax is still a bit stricter than none
    • Disallow the GET method for actions (e.g. creation & deletion)
    • Use CSRF tokens where possible
    • Enable "Enforce referrer" in TYPO3
  • File upload
    • Could allow remote code execution
    • Give you a bad site reputation
    • Could allow information disclosure
  • Checks on file uploads
    • File size
    • File extension
    • Mime type
    • Mime type matches file extension
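
To pin down the encoding & database points above, a short sketch. The table and field names are made up for illustration; the functions themselves (htmlspecialchars, json_encode and TYPO3's QueryBuilder with createNamedParameter) are the ones mentioned in the talk.

<?php
use TYPO3\CMS\Core\Database\ConnectionPool;
use TYPO3\CMS\Core\Utility\GeneralUtility;

// XSS: encode user-supplied values before they reach HTML or inline JSON
$search = $_GET['q'] ?? '';
echo '<p>Results for: ' . htmlspecialchars($search, ENT_QUOTES | ENT_HTML5) . '</p>';
// The JSON_HEX_* flags stop a "</script>" inside the value breaking out of the tag
echo '<script>const query = '
    . json_encode($search, JSON_HEX_TAG | JSON_HEX_AMP | JSON_HEX_APOS | JSON_HEX_QUOT)
    . ';</script>';

// SQL injection: never concatenate values - use named parameters (prepared statements)
$queryBuilder = GeneralUtility::makeInstance(ConnectionPool::class)
    ->getQueryBuilderForTable('tx_example_items'); // made-up table name
$rows = $queryBuilder
    ->select('uid', 'title')
    ->from('tx_example_items')
    ->where(
        $queryBuilder->expr()->eq('title', $queryBuilder->createNamedParameter($search))
    )
    ->executeQuery()
    ->fetchAllAssociative();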

The Art of Deployment - Martin Helmich

T3DD Schedule Link / Slides / Video (tbc)

  • Deployment evolution
    • FTP
    • FileZilla
    • Rsync
    • Version Control (e.g. git pull on live server)
    • Atomic deployments
  • Deployments should be repeatable and automatable
  • Deployments should not cause downtime
  • Deployments should be reversible
  • There is already Gitlab & TYPO3 deployment configuration
  • Atomic Deployments
    • TYPO3 Surf
    • PHP Deployer
  • Code is straightforward to deploy
  • Database is harder as it is more difficult to roll back - your app needs to be compatible with both versions of the DB
  • Automated deployments need quality control
    • Testing (PHPUnit, Jest)
    • Coding Style (CSFixer, Code Sniffer)
    • Type Checking (PHPStan, PSALM)
  • Containers remove environment disparity
  • Helm is deployment for Kubernetes
  • MACH
    • Microservices
    • API first
    • Cloud Native
    • Headless
      • Microservices create more deployment services
      • Which order do you release?
  • Dark launching
    • Launching code behind a feature flag
    • It doesn't matter which order you deploy in, as you enable the feature after deployment (a minimal sketch follows this list)
  • Unleash - an open source feature flag service
    • It's what Gitlab uses under the hood
  • openfeature.dev
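
A tiny sketch of the dark-launching idea. The FeatureFlags class below is hypothetical (in practice it would wrap Unleash, OpenFeature or similar); the point is that the new code path ships disabled and is switched on after all the deployments have happened.

<?php
// Hypothetical feature-flag wrapper - purely for illustration
final class FeatureFlags
{
    public function __construct(private array $flags) {}

    public function isEnabled(string $name): bool
    {
        return $this->flags[$name] ?? false;
    }
}

$flags = new FeatureFlags(['new-checkout' => false]); // flipped to true post-deploy

if ($flags->isEnabled('new-checkout')) {
    // new, not-yet-announced code path
} else {
    // existing behaviour stays live until the flag is enabled
}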

Other Talk Resources

Read time: 6 mins

Tags:

]]>
TYPO3 Developer Days 2024: Day 1 - 1st August 2024 https://www.mikestreety.co.uk/blog/typo3-developer-days-2024-day-1-1st-august-2024/ Thu, 01 Aug 2024 00:00:00 GMT https://www.mikestreety.co.uk/blog/typo3-developer-days-2024-day-1-1st-august-2024/ <![CDATA[

TYPO3 Developer Days is taking place in Germany over the next few days. As I attend each talk, I've been writing bullet points in my notebook of noteworthy things, things I agree with, things to remember or things to look up later.

The following post is those bullet points in a digital format. They probably won't make sense to anyone, but serve as a nudge for future me and stop them from living and dying in my notebook. They are also my twist and interpretation of what was said - some of it is verbatim, but other notes are what I took from it.

See other days:

Keynote - Benni Mack

T3DD Schedule Link / Slides (tbc) / Video (tbc)

  • Generative AI in a CMS
    • Content generation
    • Coding
    • Translating
  • Headless separates content from design
  • Structured and semantic content is key
  • Get your content straight and then let AI do the heavy lifting of editing, optimising & translating
  • TYPO3 Content blocks
    • Currently an extension but will be in the core in v13
  • TYPO3 SurfCamp built a website with site sets
    • https://github.com/typo3incubator
  • TYPO3 have a11y tests in the pipeline - look into this
  • TYPO3 are extending the LTS by 2 months (from v13)

Migrating from jQuery - Core Journey to Vanilla JS - Andreas Nedbal

T3DD Schedule Link / Slides / Video (tbc)

  • Replace $ with document.querySelector(All)
  • Replace .attr() with getAttribute/setAttribute
  • For .data('*'), use a regex to replace with .dataset.$1
  • Native DOM API has no chaining of methods
  • closest exists in native JS too
  • new RegularEvent('change', function() {}).delegateTo (ref)
  • TYPO3 Backend JS has its own AjaxResponse class to use instead of $.ajax
  • Lit/LitElements is a wrapper library for web components which is helpful for building dynamic HTML
  • Firefox restricts JS to the frame (Webkit ignores this)

Settings and Configuration Management - Benjamin Franzke

T3DD Schedule Link / Slides / Video (tbc)

  • All settings should have a default
  • Site Sets - Configuration/Sets/[SET NAME]
    • Settings, TypoScript & TSConfig
    • Shareable
  • Configuration/Sets/*/config.yaml
    • Should have a name and label
  • In TYPO3 -> Edit Site Config -> "Sets for this site"
  • Site Sets can inherit other site sets
  • settings.yaml is available in v12 inside the config/sites folder
    • This was then replicated in a site set to centralise settings & allow to be shared
  • Settings are rendered as if they were in constants.typoscript
    • Can access them in TypoScript and Fluid
    • There is also a new method/attribute on a site config (getSettings()) - see the sketch after this list
  • Any setup.typoscript and page.tsconfig inside the site set folder will be loaded automatically
  • Site sets are at a site level (they can't be loaded on individual pages)
  • Once using site sets, remove the TypoScript include (e.g. EXT:form/Configuration/TypoScript/setup) and, instead, import the site set
    • Encourage ext authors to use site sets
  • No more db changes are needed (e.g. sys_template)
  • Site settings also replace constants.typoscript (although still compatible)
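
For future reference, a minimal sketch of reading a site-set setting in PHP. It assumes the v12/v13 API where the site object resolved from the request exposes getSettings(); the settings key is made up, so treat the names as assumptions and check the current docs.

<?php
use Psr\Http\Message\ServerRequestInterface;
use TYPO3\CMS\Core\Site\Entity\Site;

function getBrandColour(ServerRequestInterface $request): string
{
    /** @var Site $site */
    $site = $request->getAttribute('site');

    // 'mysitepackage.brandColour' is a made-up settings identifier
    return (string)$site->getSettings()->get('mysitepackage.brandColour');
}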

Related Links

TYPO3 Agencies and AI: An Experience Report - Fabian Stein

T3DD Schedule Link / Slides (tbc) / Video (tbc)

  • AI is seen as boring, disruptive and as an uncertainty
  • AI will not replace people in the short/mid-term
  • Clients are highly interested in AI, but they don't know how to use it
  • Use cases
    • Getting information (e.g. FAQs)
    • Help with text editing
    • Generating stock images
  • How can we support use of AI?
    • Tell success stories
    • Workshops - give people time to understand
    • Offer guidance
  • AI is change, and change is scary
  • RAG Workflow
    • Advantages
      • Easy to update
      • Easy to change
      • Many opportunities to use it
    • Disadvantages
      • Difficult to set up
      • Still can generate wrong answers
  • Open Source LLMs
    • Llama 3.1 (still needs a lot of resources)
  • Gaia-X
    • OpenGPT-X
      • Although no public progress since September last year

Testing with Doubles: Why, When, and How? - Sebastian Bergmann

T3DD Schedule Link / Slides / Video (tbc)

This was a more practical talk that went a little over my head, hence the small notes

  • Testing double is like a stunt double
    • Saves money as your double replaces "expensive" calls
  • Terms
    • Dummy object - no methods
    • Test stub - object & methods
    • Test Spy - keeps track of which methods and properties were called
    • Mock object - expect X to be called Y times
  • Test stub
    • Looks like a real object
    • Can be configured to return a value or throw exception
  • Mock object
    • Looks like a real object
    • Can accept messages
    • Test communications between objects
  • PHP makes mock objects dynamic
    • E.g. with createStub (a minimal sketch follows this list)
  • PHPStan reads certain annotations (e.g. @template) to enable better analysis - this also helps with IDE hinting
  • Never mock what isn't yours
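
To remember the difference between a stub and a mock, a minimal PHPUnit sketch - the PriceProvider/Logger interfaces and the Basket class are made up for illustration.

<?php
use PHPUnit\Framework\TestCase;

interface PriceProvider
{
    public function priceFor(string $sku): float;
}

interface Logger
{
    public function log(string $message): void;
}

final class Basket
{
    public function __construct(
        private PriceProvider $prices,
        private Logger $logger,
    ) {}

    public function total(array $skus): float
    {
        $total = 0.0;
        foreach ($skus as $sku) {
            $total += $this->prices->priceFor($sku);
        }
        $this->logger->log('Calculated total: ' . $total);
        return $total;
    }
}

final class BasketTest extends TestCase
{
    public function testTotalUsesAStubAndAMock(): void
    {
        // Stub: looks like the real object and returns a canned value
        $prices = $this->createStub(PriceProvider::class);
        $prices->method('priceFor')->willReturn(10.0);

        // Mock: verifies the communication - log() must be called exactly once
        $logger = $this->createMock(Logger::class);
        $logger->expects($this->once())->method('log');

        self::assertSame(20.0, (new Basket($prices, $logger))->total(['a', 'b']));
    }
}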

Links

Time Management for Developers - Rachel Foucard

T3DD Schedule Link / Slides (tbc) / Video (tbc)

  • Perception
    • Rachel told the story of rocks in a jar
    • Detect when something is good enough
      • You can add more value by focusing on more features, rather than whittling something to perfection
    • Past / Present / Future -> Not Now / Now / Not Now -> Done / In Progress / To Do
    • Days are routine, which can help you synchronise with others, but they sometimes feel like they are running away from you
    • Social Acceleration
      • Technical (e.g. sending an email instead of a letter)
      • Social Change (e.g. postal workers losing jobs because of email)
      • Pace of Life (e.g. instantaneous communication instead of waiting for a letter)
  • Synchronisation
    • Meetings!
    • Use the same tools to plan your personal life as you do your professional life
  • Tools
    • Mandatory tools are ones your stakeholders/company use
    • Personal tools are the ones you prefer
    • The best tool is the one that you use regularly
    • If a task is less than 5 minutes, don't add it to your To Do, do it now
    • Personal To do lists should be: Update -> Watch -> Update -> Watch (etc)
      • Should look at your to do list at least 3 times a day
    • Synchronous tools (e.g. PM tools) should follow the same pattern but be looked at daily
  • Anticipation
    • Executing the task before time
    • Foreseeing tasks
    • Emergency vs Priority is Fast vs First
    • Never plan 5 days of work a week - it doesn't exist
    • Track your time - not just for your company but for yourself; see where your time goes and you will get better estimations

Other Talk Resources

Read time: 5 mins

Tags:

]]>