11 results
Andrew Fletcher
Goal: Restrict content access through username and password entry on an Nginx server. 1: Apache Utilities Package. First, update your server's package index: sudo apt update. Check whether the utilities package exists in your environment by running dpkg --get-selections | grep apache. Response: apache2-utils install, libapache-pom-java install. So it exists. But what do you do if it doesn't exist? And why do I need to install apache2-utils? To restrict access you will be using...
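The teaser cuts off here, but as a rough sketch of where this goes: once apache2-utils is available, htpasswd generates the credentials file and Nginx references it with the auth_basic directives. This assumes a Debian/Ubuntu host and the default /etc/nginx layout; the /private/ location, the .htpasswd path, and myuser are placeholders, not from the article.

```bash
# Install the utilities package if dpkg showed it was missing (assumes Debian/Ubuntu)
sudo apt install apache2-utils

# Create a password file with a first user; htpasswd prompts for the password.
# The file path and username are illustrative.
sudo htpasswd -c /etc/nginx/.htpasswd myuser
```

```nginx
# Minimal sketch: protect a location by pointing Nginx at the password file
location /private/ {
    auth_basic           "Restricted";
    auth_basic_user_file /etc/nginx/.htpasswd;
}
```

After editing the server block, reload Nginx (for example with sudo systemctl reload nginx) so the change takes effect.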
Andrew Fletcher
Goal: Restrict content access through username and password entry. 1: Apache...
Andrew Fletcher
Working on a decoupled React / Drupal 9 site. Aim: Adjust the output of curated...
Andrew Fletcher
How to make React calls on a Drupal 9 backend site using the search...
Andrew Fletcher
This code is from a Drupal 9 back-end for a React front-end via REST API. Working...
Andrew Fletcher
As a web developer, you will most likely need to run local copies of a bunch of different web sites, regularly switching between several sites daily: sometimes Drupal, other times Laravel, and whatever other frameworks are in your toolkit. Install Docker and Lando. Do you have Docker installed? If not, go to the Docker Desktop page and download it. Now let's look into Lando. Have a look at the Lando releases on GitHub to download the latest package for your OS. Run the installer. I...
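Once Docker and Lando are installed, the per-project setup is driven by a .lando.yml file. A minimal sketch for a Drupal 9 project, assuming the docroot lives in web/ (the project name, webroot, and PHP version below are illustrative, not from the article):

```yaml
# .lando.yml -- minimal Drupal 9 recipe; values here are placeholders
name: mysite
recipe: drupal9
config:
  webroot: web
  php: '8.1'
```

From the project root, lando start spins up the containers and lando info lists the local URLs the site is served on.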
Andrew Fletcher
Every project has to kick off somewhere. Yep, well that's a no-brainer....
Andrew Fletcher
I installed Lando 3.6.2 and Laravel 9.  When I visit the web page, I...
Andrew Fletcher
Do you want to set up a CI/CD process using GitHub Actions? This is a walk-through...
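The walk-through itself is truncated here, but as a hedged starting point: a GitHub Actions workflow lives under .github/workflows/ and might begin like the sketch below. The branch name, job layout, and PHP test commands are assumptions for illustration, not the article's actual pipeline.

```yaml
# .github/workflows/ci.yml -- illustrative only
name: CI

on:
  push:
    branches: [main]
  pull_request:

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Install dependencies
        run: composer install --no-progress --prefer-dist
      - name: Run tests
        run: vendor/bin/phpunit
```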
Andrew Fletcher
If, like me, you tried the command php artisan. However, you had the following...
Andrew Fletcher
Have you noticed through your Google account that there are items being indexed that shouldn't have been? One for me was taxonomy terms. They are not important enough to be followed, yet they were:

- Being indexed in my sitemap (do I just erase those entries from the XML file?)
- Being crawled (I am guessing this is handled with a robots.txt file, though I have never created one before)
- Being viewed (stumbled upon); is this even possible to block?

Resolve these quickly by simply adding disallow rules in robots.txt...
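As a rough sketch of the robots.txt side of the fix, assuming the default Drupal taxonomy term paths (the exact paths to block depend on your site):

```text
# robots.txt -- illustrative disallow rule for taxonomy term pages
User-agent: *
Disallow: /taxonomy/term/
```

Note that Disallow only stops crawling; pages that are already indexed may also need to be dropped from the XML sitemap or marked noindex before they disappear from search results.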