Compare commits


4 Commits

Author SHA1 Message Date
55778b8fb8 Merge pull request 'added api key auth docs' (#7) from doc/api-key-auth into main
All checks were successful
Build, Test & Publish / Build (push) Successful in 19s
Build, Test & Publish / Build and Publish Container Image (push) Successful in 1m16s
Build, Test & Publish / Deploy to Infrastructure (push) Successful in 58s
Reviewed-on: #7
2025-02-18 08:52:48 +11:00
a19c3bcff4 added api key auth docs
All checks were successful
Build, Test & Publish / Build (pull_request) Successful in 56s
Build, Test & Publish / Build and Publish Container Image (pull_request) Has been skipped
Build, Test & Publish / Deploy to Infrastructure (pull_request) Has been skipped
2025-02-18 08:52:04 +11:00
af884061b6 Merge pull request 'added docker items for db' (#6) from feat/doc-local-dbs into main
All checks were successful
Build, Test & Publish / Build (push) Successful in 19s
Build, Test & Publish / Build and Publish Container Image (push) Successful in 1m1s
Build, Test & Publish / Deploy to Infrastructure (push) Successful in 1m0s
Reviewed-on: #6
2025-02-12 10:52:23 +11:00
df8f43ba5a added docker items for db
All checks were successful
Build, Test & Publish / Build (pull_request) Successful in 24s
Build, Test & Publish / Build and Publish Container Image (pull_request) Has been skipped
Build, Test & Publish / Deploy to Infrastructure (pull_request) Has been skipped
2025-02-12 10:50:06 +11:00
6 changed files with 407 additions and 2 deletions


@@ -36,6 +36,7 @@ export default defineConfig({
{ text: 'Google Sign in without Identity', link: '/dotnet/google-sign-in-without-identity' },
{ text: 'Service Testing', link: '/dotnet/service-testing' },
{ text: 'Controller Testing', link: '/dotnet/controller-testing' },
{ text: 'API Key Authentication', link: '/dotnet/api-key-auth'}
]
},
{
@@ -71,6 +72,8 @@ export default defineConfig({
collapsed: true,
items: [
{ text: 'Docker Exec', link: '/docker/exec-into-container' },
{ text: 'Local DB (MSSQL)', link: '/docker/local-db-mssql' },
{ text: 'Local DB (PostgreSQL)', link: '/docker/local-db-pg' },
]
},
{


@@ -1,3 +1,5 @@
# Docker Snippets and Musings
#### [Exec Into Container](./exec-into-container.md)
#### [Local Database With Scripts (MSSQL)](./local-db-mssql.md)
#### [Local Database With Scripts (PostgreSQL)](./local-db-pg.md)


@@ -0,0 +1,174 @@
# Local Database With Scripts (MSSQL)
When developing apps locally, it can be really useful to have a dockerised database unique to the application.
Often, rather than just running a pre-built image, you'll want to run a database with some initial data, tables, or a schema.
For this purpose we can create our own image that extends the base image and adds our own scripts.
## Setup
For most applications the directory structure will look something like this:
```
database/
  Dockerfile
  scripts/
    entrypoint.sh
    run-scripts.sh
    01-create-database.sql
    02-create-tables.sql
    03-seed-data.sql
development/
  compose.yml
src/
  ...
tests/
  ...
```
### Dockerfile
Create a Dockerfile in the `database/` directory:
::: code-group
```dockerfile [Dockerfile]
FROM mcr.microsoft.com/mssql/server:2022-latest
# Set the SQL Server environment variables
ENV ACCEPT_EULA="Y"
ENV SA_PASSWORD="Password123"
# Setup port
EXPOSE 1433
# Create a temp directory
RUN mkdir -p /tmp/init
# Copy all the scripts into the container
COPY ./scripts/ /tmp/init
ENTRYPOINT [ "/tmp/init/entrypoint.sh" ]
```
:::
::: danger
As this is a local development database, we're using the `sa` user with a simple password. **Do not use this in production**.
:::
### Scripts
Create the scripts in the `database/scripts/` directory:
::: code-group
```bash [entrypoint.sh]
#!/bin/bash
# Run the sql scripts and start sql server
/tmp/init/run-scripts.sh & /opt/mssql/bin/sqlservr
```
:::
::: code-group
```bash [run-scripts.sh]
#!/bin/bash
# Wait for the mssql database to be ready
while ! /opt/mssql-tools18/bin/sqlcmd -S localhost -U sa -P "$SA_PASSWORD" -C -Q "SELECT 1" > /dev/null; do
  sleep 1
done
echo "SQL Server is up and running"
# Check if the setup has already been executed
SETUP_DONE=$(/opt/mssql-tools18/bin/sqlcmd -S localhost -U sa -P "$SA_PASSWORD" -C -Q "IF EXISTS (SELECT 1 FROM master.sys.tables WHERE name = 'setup_marker' AND schema_id = SCHEMA_ID('dbo')) SELECT 1 ELSE SELECT 0" -h -1 -W -r 1 | grep -oE '^[0-9]+' | tr -d '[:space:]')
if [[ "$SETUP_DONE" == "1" ]]; then
  echo "Setup has already been completed. Skipping initialization."
  exit 0
else
  echo "Setup has not been completed. Running initialization."
fi
# Run all scripts in the scripts folder
for entry in /tmp/init/*.sql; do
  echo "Running script $entry"
  /opt/mssql-tools18/bin/sqlcmd -S localhost -U sa -P "$SA_PASSWORD" -C -i "$entry"
done
# Create a marker table to indicate setup completion
/opt/mssql-tools18/bin/sqlcmd -S localhost -U sa -P "$SA_PASSWORD" -C -Q "CREATE TABLE master.dbo.setup_marker (id INT PRIMARY KEY IDENTITY, created_at DATETIME DEFAULT GETDATE())"
echo "All scripts have been run"
```
:::
The above script waits for the database to be ready, then checks whether the setup has already run. If not, it runs all the scripts in the `scripts/` directory and creates a marker table to indicate that the setup has been completed.
Create any scripts that you need in the `database/scripts/` directory.
::: tip
See below for an example of the scripts you might want to run.
:::
::: code-group
```sql [01-create-database.sql]
CREATE DATABASE MyDatabase
```
```sql [02-create-tables.sql]
USE MyDatabase
CREATE TABLE MyTable (
id INT PRIMARY KEY,
name NVARCHAR(50)
)
```
```sql [03-seed-data.sql]
USE MyDatabase
INSERT INTO MyTable (id, name) VALUES (1, 'Alice')
INSERT INTO MyTable (id, name) VALUES (2, 'Bob')
```
```sql [04-create-user.sql]
USE MyDatabase
CREATE LOGIN MyUser WITH PASSWORD = 'MyPassword'
CREATE USER MyUser FOR LOGIN MyUser
ALTER ROLE db_owner ADD MEMBER MyUser;
```
:::
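With the Dockerfile and scripts in place, you can sanity-check the image on its own before wiring it into compose. The image tag and container name below (`my-app-db`) are placeholders, and the password matches the one baked into the Dockerfile above:
```bash
# Build the image from the database/ directory
docker build -t my-app-db ./database

# Run it and publish the SQL Server port to the host
docker run --rm -d --name my-app-db -p 1433:1433 my-app-db

# Once the init scripts have finished, query the seed data
docker exec -it my-app-db /opt/mssql-tools18/bin/sqlcmd \
  -S localhost -U sa -P "Password123" -C \
  -Q "SELECT * FROM MyDatabase.dbo.MyTable"
```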
## Compose
Lastly, we need to create a `compose.yml` file in the `development/` directory:
::: code-group
```yaml [compose.yml]
services:
  db:
    build:
      context: ../database
      dockerfile: Dockerfile
    volumes:
      - db-data:/var/opt/mssql
    ports:
      - "1433:1433"

volumes:
  db-data:
```
:::
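From the `development/` directory, `docker compose up` builds the image (on the first run) and starts the database. Because the data persists in the `db-data` volume, the marker table means the init scripts are effectively only applied once:
```bash
# Build and start the database in the background
docker compose up -d --build

# Tail the init output to confirm the scripts ran
docker compose logs -f db
```
An application running on the host can then connect with something like `Server=localhost,1433;Database=MyDatabase;User Id=MyUser;Password=MyPassword;TrustServerCertificate=True;`, using the values from the example scripts above.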


@@ -0,0 +1,89 @@
# Local Database With Scripts (PostgreSQL)
When developing apps locally, it can be really useful to have a dockerised database unique to the application.
Often, rather than just running a pre-built image, you'll want to run a database with some initial data, tables, or a schema.
For this purpose we can create our own image that extends the base image and adds our own scripts.
## Setup
For most applications the directory structure will look something like this:
```
database/
  Dockerfile
  scripts/
    01-create-tables.sql
development/
  compose.yml
src/
  ...
tests/
  ...
```
### Dockerfile
Create a Dockerfile in the `database/` directory:
::: code-group
```dockerfile [Dockerfile]
FROM postgres:17
# Setup the postgres environment variables
ENV POSTGRES_USER=myuser
ENV POSTGRES_PASSWORD=mypassword
ENV POSTGRES_DB=mydatabase
# Setup port
EXPOSE 5432
# Copy all the scripts into the container
COPY ./scripts /docker-entrypoint-initdb.d/
```
:::
::: danger
As this is a local development database, we're using a simple username and password. **Do not use this in production**.
:::
### Scripts
Create any scripts you need in the `database/scripts/` directory. PostgreSQL will run these scripts in alphabetical order against the database specified in the `POSTGRES_DB` environment variable. Note that the official image only runs them when the data directory is empty, i.e. on the first start against a fresh volume.
::: code-group
```sql [01-create-tables.sql]
CREATE TABLE MyTable (
Id INT NOT NULL PRIMARY KEY,
Name VARCHAR(50) NOT NULL
);
```
:::
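Because the scripts run in alphabetical order, numbering them keeps any dependencies between them straightforward. For example, a hypothetical follow-up seed script that relies on the table created above:
::: code-group
```sql [02-seed-data.sql]
-- Hypothetical example: runs after 01-create-tables.sql because of its prefix
INSERT INTO MyTable (Id, Name) VALUES (1, 'Alice');
INSERT INTO MyTable (Id, Name) VALUES (2, 'Bob');
```
:::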
## Compose
Lastly, we need to create a `compose.yml` file in the `development/` directory:
::: code-group
```yaml [compose.yml]
services:
  db:
    build:
      context: ../database
      dockerfile: Dockerfile
    volumes:
      - db-data:/var/lib/postgresql/data
    ports:
      - "5432:5432"

volumes:
  db-data:
```
:::
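As with the MSSQL setup, bring the database up from the `development/` directory and check that the init scripts ran. The service, user, and database names below come from the examples above:
```bash
# Build and start the database in the background
docker compose up -d --build

# Confirm the table from 01-create-tables.sql exists
docker compose exec db psql -U myuser -d mydatabase -c "\dt"
```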

docs/dotnet/api-key-auth.md (new file, 136 lines)

@@ -0,0 +1,136 @@
# API Key Auth
Simple API key authentication is a great option when building public-facing APIs that don't have strict security requirements but that you'd still rather not leave wide open. Think syncs, long-running jobs, or other non-critical operations.
## Configuration
This example stores the ApiKey in the `appsettings.json` file. You can also store it in a database, environment variable, or any other configuration source.
::: code-group
```json [appsettings.json]
{
  "Logging": {
    "LogLevel": {
      "Default": "Information",
      "Microsoft.AspNetCore": "Warning"
    }
  },
  "AllowedHosts": "*",
  "ApiKey": "ThisIsMySecretKey",
  // ...
}
```
:::
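If you'd rather keep the key out of `appsettings.json`, the same `ApiKey` configuration key can be supplied through other default configuration providers; a rough sketch, assuming a standard ASP.NET Core setup:
```bash
# Option 1: user secrets during development (adds a UserSecretsId to the project)
dotnet user-secrets init
dotnet user-secrets set "ApiKey" "ThisIsMySecretKey"

# Option 2: an environment variable with the same key name
export ApiKey="ThisIsMySecretKey"
```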
## Filter
The logic for the API key authentication is a simple authorization filter: it checks the `X-API-KEY` request header against the configured `ApiKey` value.
Start by storing the header name in a constants file or similar:
::: code-group
```csharp [Constants.cs]
namespace ApiKeyAuthDemo.Core
{
    public static class Constants
    {
        public const string API_KEY_HEADER_NAME = "X-API-KEY";
    }
}
```
:::
Then create the filter:
::: code-group
```csharp [ApiKeyAuthorizeAttribute.cs]
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Mvc.Filters;

namespace ApiKeyAuthDemo.Core.Filters
{
    [AttributeUsage(AttributeTargets.Class | AttributeTargets.Method)]
    public class ApiKeyAuthorizeAttribute() : Attribute, IAuthorizationFilter
    {
        public void OnAuthorization(AuthorizationFilterContext context)
        {
            // Get the API key from the request headers
            string? apiKeyValue = context.HttpContext.Request.Headers[Constants.API_KEY_HEADER_NAME];

            // Get the API key from the configuration
            IConfiguration configuration = context.HttpContext.RequestServices.GetRequiredService<IConfiguration>();
            string? apiKey = configuration.GetValue<string>("ApiKey");

            // Check if the API key is valid and set
            if (apiKeyValue == null || apiKeyValue != apiKey)
            {
                context.Result = new UnauthorizedResult();
            }
        }
    }
}
```
:::
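Because the filter only depends on `HttpContext`, it can be unit tested in the same style as the other testing docs. The sketch below is only an example, assuming xUnit and in-memory configuration; the test file name and helper are hypothetical:
::: code-group
```csharp [ApiKeyAuthorizeAttributeTests.cs]
using System.Collections.Generic;
using ApiKeyAuthDemo.Core;
using ApiKeyAuthDemo.Core.Filters;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Mvc.Abstractions;
using Microsoft.AspNetCore.Mvc.Filters;
using Microsoft.AspNetCore.Routing;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Xunit;

namespace ApiKeyAuthDemo.Tests
{
    public class ApiKeyAuthorizeAttributeTests
    {
        // Build an AuthorizationFilterContext with an in-memory "ApiKey" setting
        private static AuthorizationFilterContext CreateContext(string? headerValue)
        {
            IConfiguration configuration = new ConfigurationBuilder()
                .AddInMemoryCollection(new Dictionary<string, string?> { ["ApiKey"] = "ThisIsMySecretKey" })
                .Build();

            ServiceProvider services = new ServiceCollection()
                .AddSingleton(configuration)
                .BuildServiceProvider();

            DefaultHttpContext httpContext = new() { RequestServices = services };
            if (headerValue is not null)
            {
                httpContext.Request.Headers[Constants.API_KEY_HEADER_NAME] = headerValue;
            }

            ActionContext actionContext = new(httpContext, new RouteData(), new ActionDescriptor());
            return new AuthorizationFilterContext(actionContext, new List<IFilterMetadata>());
        }

        [Fact]
        public void OnAuthorization_WithWrongKey_SetsUnauthorizedResult()
        {
            AuthorizationFilterContext context = CreateContext("WrongKey");
            new ApiKeyAuthorizeAttribute().OnAuthorization(context);
            Assert.IsType<UnauthorizedResult>(context.Result);
        }

        [Fact]
        public void OnAuthorization_WithCorrectKey_LeavesResultUnset()
        {
            AuthorizationFilterContext context = CreateContext("ThisIsMySecretKey");
            new ApiKeyAuthorizeAttribute().OnAuthorization(context);
            Assert.Null(context.Result);
        }
    }
}
```
:::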
## Usage
See below for example usage (on the second GET method):
::: code-group
```csharp [WeatherForecastController.cs]
using ApiKeyAuthDemo.Core.Filters;
using Microsoft.AspNetCore.Mvc;

namespace ApiKeyAuthDemo.Controllers;

[ApiController]
[Route("[controller]")]
public class WeatherForecastController : ControllerBase
{
    private static readonly string[] Summaries = new[]
    {
        "Freezing", "Bracing", "Chilly", "Cool", "Mild", "Warm", "Balmy", "Hot", "Sweltering", "Scorching"
    };

    private readonly ILogger<WeatherForecastController> _logger;

    public WeatherForecastController(ILogger<WeatherForecastController> logger)
    {
        _logger = logger;
    }

    [HttpGet]
    public IEnumerable<WeatherForecast> Get()
    {
        return Enumerable.Range(1, 5).Select(index => new WeatherForecast
        {
            Date = DateOnly.FromDateTime(DateTime.Now.AddDays(index)),
            TemperatureC = Random.Shared.Next(-20, 55),
            Summary = Summaries[Random.Shared.Next(Summaries.Length)]
        })
        .ToArray();
    }

    [ApiKeyAuthorize]
    [HttpGet("auth")]
    public IEnumerable<WeatherForecast> GetAuth()
    {
        return Enumerable.Range(1, 5).Select(index => new WeatherForecast
        {
            Date = DateOnly.FromDateTime(DateTime.Now.AddDays(index)),
            TemperatureC = Random.Shared.Next(-20, 55),
            Summary = Summaries[Random.Shared.Next(Summaries.Length)]
        })
        .ToArray();
    }
}
```
:::
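Once the app is running, the difference between the two endpoints is easy to see from the command line. The port below is a placeholder for whatever your launch profile uses, and the key value matches the example configuration:
```bash
# Unprotected endpoint: returns 200 with the forecast payload
curl -i https://localhost:5001/WeatherForecast

# Protected endpoint without the header: 401 Unauthorized
curl -i https://localhost:5001/WeatherForecast/auth

# Protected endpoint with the configured key: 200
# (add -k if the local dev certificate isn't trusted)
curl -i -H "X-API-KEY: ThisIsMySecretKey" https://localhost:5001/WeatherForecast/auth
```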


@@ -12,3 +12,4 @@
#### [Google Sign in Without Identity](./google-sign-in-without-identity.md)
#### [Service Testing](./service-testing.md)
#### [Controller Testing](./controller-testing.md)
#### [API Key Authentication](./api-key-auth.md)