This guide covers deploying LogScope in various environments.
## Prerequisites

- Java 17 or higher
- Docker and Docker Compose (for containerized deployment)
- AWS credentials with CloudWatch Logs read access
## Quick Start with Docker Compose

The easiest way to deploy LogScope is using Docker Compose:

```bash
# Clone the repository
git clone https://github.com/yourorg/logscope.git
cd logscope

# Configure AWS credentials
export AWS_ACCESS_KEY_ID=your-access-key
export AWS_SECRET_ACCESS_KEY=your-secret-key
export AWS_REGION=us-east-1

# Start the application
docker-compose up -d
```

The dashboard will be available at http://localhost:8080.
## Environment Variables

| Variable | Description | Default |
|---|---|---|
| AWS_ACCESS_KEY_ID | AWS access key | - |
| AWS_SECRET_ACCESS_KEY | AWS secret key | - |
| AWS_REGION | AWS region | us-east-1 |
| AWS_PROFILE | AWS profile (alternative to keys) | - |
| LOGSCOPE_DB_PATH | SQLite database path | ./data/logscope.db |
| LOGSCOPE_PORT | Application port | 8080 |
| LOGSCOPE_ADMIN_PORT | Admin port | 8081 |
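As an alternative to exporting these by hand, Docker Compose automatically reads a `.env` file from the project directory; a minimal sketch with placeholder values:

```bash
# .env — picked up by docker-compose; all values below are placeholders
AWS_ACCESS_KEY_ID=your-access-key
AWS_SECRET_ACCESS_KEY=your-secret-key
AWS_REGION=us-east-1
LOGSCOPE_PORT=8080
LOGSCOPE_DB_PATH=./data/logscope.db
```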
## IAM Permissions

Minimum required IAM permissions:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "logs:DescribeLogGroups",
        "logs:DescribeLogStreams",
        "logs:GetLogEvents",
        "logs:FilterLogEvents"
      ],
      "Resource": "*"
    }
  ]
}
```

For production, restrict the `Resource` to specific log groups:

```json
"Resource": [
  "arn:aws:logs:us-east-1:123456789:log-group:/aws/lambda/*",
  "arn:aws:logs:us-east-1:123456789:log-group:/ecs/*"
]
```

## Application Configuration

LogScope is configured via `config.yml`:

```yaml
server:
  applicationConnectors:
    - type: http
      port: ${LOGSCOPE_PORT:-8080}
  adminConnectors:
    - type: http
      port: ${LOGSCOPE_ADMIN_PORT:-8081}

aws:
  region: ${AWS_REGION:-us-east-1}
  profile: ${AWS_PROFILE:-}

database:
  driverClass: org.sqlite.JDBC
  url: jdbc:sqlite:${LOGSCOPE_DB_PATH:-./data/logscope.db}

patternDetection:
  enabled: true
  analysisIntervalMinutes: 5

alerting:
  enabled: true
  defaultThresholds:
    - name: "Critical Errors"
      patternType: ERROR
      threshold: 10
      windowMinutes: 5
      severity: CRITICAL
```

See docker-compose.yml for the full configuration.
## Deployment Options

### Docker Compose

```bash
docker-compose up -d
```

### Standalone JAR

```bash
# Build the JAR
mvn clean package

# Run
java -jar target/logscope-1.0-SNAPSHOT.jar server config.yml
```

### Kubernetes

Example Kubernetes deployment:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: logscope
spec:
  replicas: 1
  selector:
    matchLabels:
      app: logscope
  template:
    metadata:
      labels:
        app: logscope
    spec:
      containers:
        - name: logscope
          image: logscope:latest
          ports:
            - containerPort: 8080
          env:
            - name: AWS_REGION
              value: "us-east-1"
          volumeMounts:
            - name: data
              mountPath: /app/data
            - name: aws-credentials
              mountPath: /root/.aws
              readOnly: true
      volumes:
        - name: data
          persistentVolumeClaim:
            claimName: logscope-data
        - name: aws-credentials
          secret:
            secretName: aws-credentials
```

### ECS

For ECS deployments, use IAM roles instead of access keys:
```json
{
  "taskRoleArn": "arn:aws:iam::123456789:role/logscope-task-role",
  "containerDefinitions": [
    {
      "name": "logscope",
      "image": "123456789.dkr.ecr.us-east-1.amazonaws.com/logscope:latest",
      "portMappings": [
        {
          "containerPort": 8080,
          "hostPort": 8080
        }
      ],
      "mountPoints": [
        {
          "sourceVolume": "data",
          "containerPath": "/app/data"
        }
      ]
    }
  ]
}
```

Note that `mountPoints` references a volume named `data`, which must also be declared in the task definition's top-level `volumes` section.

## Data Persistence

LogScope uses SQLite for local caching. Ensure the data directory is persisted:

- Docker: Mount a volume to `/app/data`
- Kubernetes: Use a PersistentVolumeClaim
- ECS: Use EFS or EBS volumes
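For the Docker case, the mount can be declared in docker-compose.yml; a minimal sketch, assuming the service is named `logscope`:

```yaml
services:
  logscope:
    volumes:
      # Persist the SQLite cache outside the container
      - ./data:/app/data
```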
## Health Checks

- Application health: `GET /healthcheck`
- Admin health: `GET :8081/healthcheck`
- Ping: `GET :8081/ping`
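On Kubernetes, these endpoints can back container probes. A sketch for the container spec shown earlier; the probe timings are illustrative, not from the source:

```yaml
livenessProbe:
  httpGet:
    path: /healthcheck
    port: 8080
  initialDelaySeconds: 30
  periodSeconds: 15
readinessProbe:
  httpGet:
    path: /healthcheck
    port: 8080
  initialDelaySeconds: 10
  periodSeconds: 10
```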
## Metrics

Dropwizard metrics are available at:

- `GET :8081/metrics` - JSON metrics
- `GET :8081/metrics/prometheus` - Prometheus format
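A minimal Prometheus scrape job for the admin port might look like the following; the job name and target are assumptions for a single local instance:

```yaml
scrape_configs:
  - job_name: logscope
    metrics_path: /metrics/prometheus
    static_configs:
      - targets: ["localhost:8081"]
```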
## Troubleshooting

### 1. AWS Credentials Not Found

```
com.amazonaws.SdkClientException: Unable to load AWS credentials
```

Solution: Ensure AWS credentials are configured via environment variables, an AWS profile, or an IAM role.
### 2. Database Locked

```
org.sqlite.SQLiteException: [SQLITE_BUSY] The database file is locked
```

Solution: Ensure only one instance is accessing the SQLite database, or enable WAL mode.
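WAL mode can be enabled once with the `sqlite3` CLI; the setting persists in the database file. A sketch using the default LOGSCOPE_DB_PATH:

```bash
# Create the data directory if needed, then switch the database to
# write-ahead logging so readers no longer block the writer.
# sqlite3 prints the resulting journal mode ("wal") on success.
mkdir -p ./data
sqlite3 ./data/logscope.db 'PRAGMA journal_mode=WAL;'
```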
### 3. Memory Issues

For large log volumes, increase the JVM heap:

```bash
java -Xmx2g -jar logscope.jar server config.yml
```

## Security Considerations

- Network Security: Deploy behind a reverse proxy with TLS
- AWS Credentials: Use IAM roles when possible; never commit credentials
- Access Control: Implement authentication if exposing to untrusted networks
- Data Sensitivity: Be aware that cached logs may contain sensitive data
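The reverse-proxy recommendation might look like this with nginx; a minimal sketch in which the server name and certificate paths are placeholders:

```nginx
server {
    listen 443 ssl;
    server_name logscope.example.com;

    ssl_certificate     /etc/ssl/certs/logscope.pem;
    ssl_certificate_key /etc/ssl/private/logscope.key;

    location / {
        # Forward to the LogScope application port
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto https;
    }
}
```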