🔐 SSL: The Security Foundation of the Modern Web

👋 Introduction

In today’s digital landscape, SSL (Secure Sockets Layer) and its successor TLS (Transport Layer Security) form the backbone of internet security. Every time you see that reassuring padlock icon in your browser’s address bar, you’re witnessing SSL/TLS in action. But what exactly is SSL, how does it work, and why has it become so crucial for every website owner? Let’s dive deep into the world of SSL certificates and explore how they’ve transformed the web.

⚙️ What is SSL and How Does It Work?

SSL (Secure Sockets Layer) is a cryptographic protocol designed to provide secure communication over a computer network. While SSL has been largely replaced by TLS (Transport Layer Security), the term “SSL” is still commonly used to refer to both protocols.

The SSL Handshake Process

When you visit a website with SSL enabled, a complex but lightning-fast process occurs:

  1. Client Hello: Your browser sends a “hello” message to the server, including supported encryption methods
  2. Server Hello: The server responds with its chosen encryption method and sends its SSL certificate
  3. Certificate Verification: Your browser verifies the certificate’s authenticity against trusted Certificate Authorities (CAs)
  4. Key Exchange: Both parties establish a shared secret key for encryption
  5. Secure Connection: All subsequent communication is encrypted using the established key
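
If you want to watch this handshake from code, Ruby's standard library can open a TLS connection and report what was negotiated. A minimal sketch (example.com is just a placeholder host; error handling omitted):

require "socket"
require "openssl"

tcp = TCPSocket.new("example.com", 443)
ctx = OpenSSL::SSL::SSLContext.new
ctx.verify_mode = OpenSSL::SSL::VERIFY_PEER                           # step 3: verify the certificate
ctx.cert_store  = OpenSSL::X509::Store.new.tap(&:set_default_paths)   # system's trusted CAs

ssl = OpenSSL::SSL::SSLSocket.new(tcp, ctx)
ssl.hostname = "example.com"   # SNI, sent along with the Client Hello
ssl.connect                    # steps 1-5 happen inside this call

puts ssl.ssl_version           # e.g. "TLSv1.3"
puts ssl.peer_cert.subject     # the certificate the server presented
puts ssl.peer_cert.not_after   # when that certificate expires
ssl.close
tcp.close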

Encryption Types

SSL uses two types of encryption:

  • Symmetric Encryption: Fast encryption using the same key for both encryption and decryption
  • Asymmetric Encryption: Uses a pair of keys (public and private) for initial handshake and key exchange

🌐 How SSL Transformed the Web

Before SSL: The Wild West of the Internet

In the early days of the web, all data transmitted between browsers and servers was sent in plain text. This meant:

  • No Privacy: Anyone intercepting traffic could read sensitive information
  • No Integrity: Data could be modified without detection
  • No Authentication: No way to verify you were communicating with the intended server

The SSL Revolution

SSL implementation brought three fundamental security principles to the web:

  1. Confidentiality: Data encryption ensures only intended recipients can read the information
  2. Integrity: Cryptographic hashes detect any tampering with data during transmission
  3. Authentication: Digital certificates verify the identity of websites

Impact on E-commerce and Online Services

SSL made modern e-commerce possible by:

  • Enabling secure credit card transactions
  • Building user trust in online services
  • Protecting sensitive personal information
  • Facilitating the growth of online banking and financial services

📜 SSL Certificates: Your Digital Identity Card

An SSL certificate is a digital document that:

  • Proves the identity of a website
  • Contains the website’s public key
  • Is digitally signed by a trusted Certificate Authority (CA)

Types of SSL Certificates

1. Domain Validated (DV) Certificates

  • Validation: Only verifies domain ownership
  • Trust Level: Basic
  • Use Case: Personal websites, blogs
  • Issuance Time: Minutes to hours

2. Organization Validated (OV) Certificates

  • Validation: Verifies domain ownership and organization details
  • Trust Level: Medium
  • Use Case: Business websites
  • Issuance Time: 1-3 days

3. Extended Validation (EV) Certificates

  • Validation: Rigorous verification of organization’s legal existence
  • Trust Level: Highest
  • Use Case: E-commerce, banking, high-security sites
  • Issuance Time: 1-2 weeks

Certificate Coverage Options

  • Single Domain: Protects one specific domain (e.g., www.example.com)
  • Multi-Domain (SAN): Protects multiple different domains
  • Wildcard: Protects a domain and all its subdomains (e.g., *.example.com)

🛠️ How to Get and Implement SSL Certificates

Step 1: Choose Your SSL Provider

Select from various Certificate Authorities based on your needs:

  • Free Options: Let’s Encrypt, SSL.com Free
  • Commercial Providers: DigiCert, GlobalSign, Sectigo, GoDaddy

Step 2: Generate a Certificate Signing Request (CSR)

# Example using OpenSSL
openssl req -new -newkey rsa:2048 -nodes -keyout yourdomain.key -out yourdomain.csr
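
If you prefer to stay in Ruby, the same key and CSR can be generated with the openssl standard library. A rough sketch (the CN value is a placeholder for your own domain):

require "openssl"

key = OpenSSL::PKey::RSA.new(2048)                       # 2048-bit private key
csr = OpenSSL::X509::Request.new
csr.subject    = OpenSSL::X509::Name.new([["CN", "yourdomain.com"]])
csr.public_key = key.public_key
csr.sign(key, OpenSSL::Digest.new("SHA256"))             # sign the request with the key

File.write("yourdomain.key", key.to_pem)
File.write("yourdomain.csr", csr.to_pem)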

Step 3: Validate Domain Ownership

Certificate Authorities typically offer three validation methods:

  • Email Validation: Receive validation email at admin@yourdomain.com
  • DNS Validation: Add a specific TXT record to your DNS
  • HTTP File Upload: Upload a verification file to your website

Step 4: Install the Certificate

Installation varies by server type:

Apache

<VirtualHost *:443>
    ServerName www.yourdomain.com
    SSLEngine on
    SSLCertificateFile /path/to/yourdomain.crt
    SSLCertificateKeyFile /path/to/yourdomain.key
    SSLCertificateChainFile /path/to/intermediate.crt
</VirtualHost>

Nginx

server {
    listen 443 ssl;
    server_name www.yourdomain.com;

    ssl_certificate /path/to/yourdomain.crt;
    ssl_certificate_key /path/to/yourdomain.key;
    ssl_protocols TLSv1.2 TLSv1.3;
}

Step 5: Configure HTTP to HTTPS Redirect

# Apache .htaccess
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

⚠️ The Cost of Not Having SSL

SEO Impact

  • Google Ranking Factor: HTTPS is a confirmed ranking signal
  • Browser Warnings: Modern browsers flag non-HTTPS sites as “Not Secure”
  • User Trust: Visitors are likely to leave unsecured sites

Security Risks

  • Data Interception: Sensitive information transmitted in plain text
  • Man-in-the-Middle Attacks: Attackers can intercept and modify communications
  • Session Hijacking: User sessions can be stolen on unsecured networks

Business Consequences

  • Lost Revenue: Users abandon transactions on insecure sites
  • Compliance Issues: Many regulations require encryption (GDPR, PCI DSS)
  • Reputation Damage: Security breaches can destroy customer trust

💰 SSL Providers: Free vs. Paid Services

Free SSL Providers

Let’s Encrypt

  • Cost: Completely free
  • Validity: 90 days (auto-renewable)
  • Support: Domain and wildcard certificates
  • Automation: Excellent with tools like Certbot
  • Limitation: DV certificates only
# Install Let's Encrypt certificate with Certbot
sudo certbot --apache -d yourdomain.com -d www.yourdomain.com

SSL.com Free

  • Cost: Free for basic DV certificates
  • Validity: 90 days
  • Features: Basic domain validation

Cloudflare SSL

  • Cost: Free with Cloudflare service
  • Features: Universal SSL for all domains
  • Limitation: Requires using Cloudflare as CDN/proxy

Commercial SSL Providers

DigiCert

  • Reputation: Industry leader with highest trust
  • Features: EV, OV, DV certificates with extensive support
  • Price Range: $175-$595+ annually
  • Benefits: 24/7 support, warranty, advanced features

GlobalSign

  • Strengths: Enterprise-focused solutions
  • Features: Complete certificate lifecycle management
  • Price Range: $149-$649+ annually

Sectigo (formerly Comodo)

  • Position: Largest commercial CA by volume
  • Features: Wide range of certificate types
  • Price Range: $89-$299+ annually

GoDaddy

  • Advantage: Integration with hosting services
  • Features: Easy installation for beginners
  • Price Range: $69-$199+ annually

Cloud Provider SSL Solutions

AWS Certificate Manager (ACM)

  • Cost: Free for AWS services
  • Integration: Seamless with CloudFront, Load Balancers, API Gateway
  • Automation: Automatic renewal and deployment
  • Limitation: Only works within AWS ecosystem
# Request certificate via AWS CLI
aws acm request-certificate \
    --domain-name yourdomain.com \
    --subject-alternative-names www.yourdomain.com \
    --validation-method DNS

Google Trust Services

  • Integration: Works with Google Cloud Platform
  • Features: Managed certificates for Google Cloud Load Balancer
  • Cost: Free for Google Cloud services
  • Automation: Automatic provisioning and renewal

Azure SSL

  • Service: App Service Certificates
  • Integration: Native Azure integration
  • Features: Wildcard and standard certificates available

✅ Best Practices for SSL Implementation

Security Configuration

  1. Use Strong Ciphers: Disable weak encryption algorithms
  2. Enable HSTS: Force HTTPS connections
  3. Configure Perfect Forward Secrecy: Protect past communications
  4. Regular Updates: Keep SSL/TLS libraries updated
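
In a Rails app, several of these items come from a single setting: config.force_ssl = true redirects HTTP to HTTPS, marks cookies as secure, and sends the HSTS header. A minimal sketch:

# config/environments/production.rb
Rails.application.configure do
  # Redirect all HTTP requests to HTTPS, flag cookies as secure,
  # and send the Strict-Transport-Security (HSTS) header
  config.force_ssl = true
end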

Monitoring and Maintenance

  • Certificate Expiration Monitoring: Set up alerts before expiration
  • Security Scanning: Regular vulnerability assessments
  • Performance Monitoring: Track SSL handshake performance
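
A tiny expiration check you could run from cron or a background job (the hostname and the 30-day threshold are placeholders to adjust):

require "net/http"

host = "yourdomain.com"
cert = nil
Net::HTTP.start(host, 443, use_ssl: true) { |http| cert = http.peer_cert }

days_left = ((cert.not_after - Time.now) / 86_400).to_i
puts "#{host}: certificate expires in #{days_left} days"
warn "Renew soon!" if days_left < 30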

Common Pitfalls to Avoid

  • Mixed Content: Ensure all resources load over HTTPS
  • Certificate Chain Issues: Include intermediate certificates
  • Weak Configurations: Avoid outdated protocols and ciphers

🚀 The Future of SSL/TLS

TLS 1.3 Adoption

  • Faster handshakes
  • Improved security
  • Better performance

Certificate Transparency

  • Public logs of all certificates
  • Enhanced security monitoring
  • Improved detection of unauthorized certificates

Automated Certificate Management

  • ACME protocol standardization
  • Integration with CI/CD pipelines
  • Infrastructure as Code compatibility

🎯 Conclusion

SSL/TLS has evolved from a nice-to-have security feature to an absolute necessity for any serious web presence. Whether you choose a free solution like Let's Encrypt for basic protection or invest in enterprise-grade certificates from providers like DigiCert, implementing SSL is no longer optional; it's essential.

The transformation from an insecure web to today’s encrypted-by-default internet represents one of the most significant security improvements in computing history. As we move forward, SSL/TLS will continue to evolve, becoming faster, more secure, and easier to implement.

For website owners, the message is clear: implement SSL today, keep your certificates updated, and follow security best practices. Your users’ trust and your website’s success depend on it.


Remember: Security is not a destination but a journey. Stay informed about the latest SSL/TLS developments and regularly review your security configurations to ensure optimal protection for your users and your business.

Happy Web coding! 🚀

Setup 🛠 Rails 8 App – Part 10: PostgreSQL Into The Action

For a Ruby on Rails 8 application, the choice of database depends on your specific needs, but here's a breakdown of the best options and when to use each:

PostgreSQL (Highly Recommended)

Best overall choice for most Rails apps.

Why:

  • First-class support in Rails.
  • Advanced features like full-text search, JSONB support, CTEs, window functions.
  • Strong consistency and reliability.
  • Scales well vertically and horizontally (with tools like Citus).
  • Used by: Discourse, GitLab, and many other large production Rails apps.

Use if:

  • You're building a standard Rails web app or API.
  • You need advanced query features or are handling complex data types (e.g., JSON).

SQLite (For development/testing only)

  • Lightweight, file-based.
  • Fast and easy to set up.
  • But not recommended for production.

Use if:

  • You're building a quick prototype or local dev/testing app.
  • NOT for multi-user production environments.

MySQL / MariaDB

  • Also supported by Rails.
  • Can work fine for simpler applications.
  • Lacks some advanced features (like robust JSON support or full Postgres-style indexing).
  • Not the default in many modern Rails setups.

Use if:

  • Your team already has MySQL infrastructure or legacy systems.
  • You need horizontal scaling with Galera Cluster or similar setups.

Others (NoSQL like MongoDB, Redis, etc.)

  • Use Redis for caching and background job data (not as primary DB).
  • Use MongoDB or other NoSQL only if your data model really demands it (e.g., unstructured documents, event sourcing).

Recommendation Summary:

Use Case | Recommended DB
Production web/API app | PostgreSQL
Dev/prototyping/local testing | SQLite
Legacy systems/MySQL infrastructure | MySQL/MariaDB
Background jobs/caching | Redis
Special needs (e.g., documents) | MongoDB (with caution)

If you’re starting fresh or building something scalable and modern with Rails 8, go with PostgreSQL.

Let's break that down:

💬 What does "robust JSON support" mean?

PostgreSQL supports special column types, json and jsonb, which let you store structured JSON data directly in your database, like hashes or objects.

Why it matters:

  • You can store dynamic data without needing to change your schema.
  • You can query inside the JSON using SQL (->, ->>, @>, etc.).
  • You can index parts of the JSON for speed.

🔧 Example:

You have a products table with a specs column that holds tech specs in JSON:

specs = {
  "color": "black",
  "brand": "Libas",
  "dimensions": {"chest": "34", "waist": "30", "shoulder": "13.5"}
}

You can query like:

SELECT * FROM products WHERE specs->>'color' = 'black';

Or check if the JSON contains a value:

SELECT * FROM products WHERE specs @> '{"brand": "Libas"}';

You can even index specs->>'color' to make these queries fast.


💬 What does "full Postgres-style indexing" mean?

PostgreSQL supports a wide variety of powerful indexing options, which improve query performance and flexibility.

⚙️ Types of Indexes PostgreSQL supports:

Index Type | Use Case
B-Tree | Default; used for most equality and range searches
GIN (Generalized Inverted Index) | Fast indexing for JSON, arrays, full-text search
Partial Indexes | Index only part of the data (e.g., WHERE active = true)
Expression Indexes | Index a function or expression (e.g., LOWER(email))
Covering Indexes (INCLUDE) | Fetch data directly from the index, avoiding table reads

  • B-Tree Indexes: B-tree indexes are more suitable for single-value columns.
  • When to Use GIN Indexes: When you frequently search for specific elements within arrays, JSON documents, or other composite data types.
  • Example for GIN Indexes: Imagine you have a table with a JSONB column containing document metadata. A GIN index on this column would allow you to quickly find all documents that have a specific author or belong to a particular category. 

Why does this matter for our shopping app?

  • We can store and filter products with dynamic specs (e.g., kurtas, shorts, pants) without new columns.
  • Full-text search on product names/descriptions.
  • Fast filters: color = 'red' AND brand = 'Libas' even if those are stored in JSON.
  • Index custom expressions like LOWER(email) for case-insensitive login.
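
As a concrete sketch, the two indexes mentioned above could be added in one Rails migration (the class name is made up, and the table/column names are the ones from these examples):

class AddJsonAndExpressionIndexes < ActiveRecord::Migration[8.0]
  def change
    # GIN index so containment queries like specs @> '{"brand": "Libas"}' stay fast
    add_index :products, :specs, using: :gin

    # Expression index for case-insensitive email lookups (LOWER(email))
    add_index :users, "LOWER(email)", name: "index_users_on_lower_email"
  end
end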

💬 What are Common Table Expressions (CTEs)?

CTEs are temporary result sets you can reference within a SQL query, like defining a mini subquery that makes complex SQL easier to read and write.

WITH recent_orders AS (
  SELECT * FROM orders WHERE created_at > NOW() - INTERVAL '7 days'
)
SELECT * FROM recent_orders WHERE total > 100;

  • Breaking complex queries into readable parts.
  • Re-using result sets without repeating subqueries.

In Rails (Active Record supports .with natively since Rails 7.1; older apps used gems like scenic or with_cte):

Order
  .with(recent_orders: Order.where('created_at > ?', 7.days.ago))
  .from('recent_orders')
  .where('total > ?', 100)

💬 What are Window Functions?

Window functions perform calculations across rows related to the current row; unlike aggregate functions, they don't group results into one row.

🔧 Example: Rank users by their score within each team:

SELECT
  user_id,
  team_id,
  score,
  RANK() OVER (PARTITION BY team_id ORDER BY score DESC) AS rank
FROM users;

Use cases:

  • Ranking rows (like leaderboards).
  • Running totals or moving averages.
  • Calculating differences between rows (e.g. "How much did this order increase from the last?").

🛤 In Rails:

Window functions are available through raw SQL or Arel. Here’s a basic example:

User
  .select("user_id, team_id, score, RANK() OVER (PARTITION BY team_id ORDER BY score DESC) AS rank")

CTEs and Window functions are fully supported in PostgreSQL, making it the go-to DB for any Rails 8 app that needs advanced querying.

JSONB Support

JSONB stands for “JSON Binary” and is a binary representation of JSON data that allows for efficient storage and retrieval of complex data structures.

This can be useful when you have data that doesn’t fit neatly into traditional relational database tables, such as nested or variable-length data structures.

Storing JSON in a relational database (like PostgreSQL) can be super powerful when used wisely. It gives you schema flexibility without abandoning the structure and power of SQL.

Here are real-world use cases for using JSON columns in relational databases:

🔧 1. Flexible Metadata / Extra Attributes

Let users store arbitrary attributes that don’t require schema changes every time.

Use case: Product variants, custom fields

t.jsonb :metadata

{
  "color": "red",
  "size": "XL",
  "material": "cotton"
}

=> Good when:

  • You can’t predict all the attributes users will need.
  • You don't want to create dozens of nullable columns.
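
For example, with the metadata column above you can filter variants straight from Active Record (a sketch; the Product model and keys come from this example, not a fixed API):

# All products whose metadata contains these key/value pairs
Product.where("metadata @> ?", { color: "red", size: "XL" }.to_json)

# Read a single key back out in SQL
Product.where("metadata ->> 'material' = ?", "cotton")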

🎛️ 2. Storing Settings or Preferences

User or app settings that vary a lot.

Use case: Notification preferences, UI layout, feature toggles

{
  "email": true,
  "sms": false,
  "theme": "dark"
}

=> Easy to store and retrieve as a blob without complex joins.

🌐 3. API Response Caching

Store external API responses for caching or auditing.

Use case: Storing Stripe, GitHub, or weather API responses.

t.jsonb :api_response

=> Avoids having to map every response field into a column.

📦 4. Storing Logs or Events

Use case: Audit trails, system logs, user events

{
  "action": "login",
  "timestamp": "2025-04-18T10:15:00Z",
  "ip": "123.45.67.89"
}

=> Great for capturing varied data over time without a rigid schema.

📊 5. Embedded Mini-Structures

Use case: A form builder app storing user-created forms and fields.

{
  "fields": [
    { "type": "text", "label": "Name", "required": true },
    { "type": "email", "label": "Email", "required": false }
  ]
}

=> When each row can have nested, structured data, almost like a mini-document.

🕹️ 6. Device or Browser Info (User Agents)

Use case: Analytics, device fingerprinting

{
  "browser": "Safari",
  "os": "macOS",
  "version": "17.3"
}

=> You don't need to normalize or query this often, so it's perfect for JSON.


JSON vs JSONB in PostgreSQL

Use jsonb over json unless you need to preserve order or whitespace.

  • jsonb is a binary format → faster and indexable
  • You can do fancy stuff like:
SELECT * FROM users WHERE preferences ->> 'theme' = 'dark';

Or in Rails:

User.where("preferences ->> 'theme' = ?", 'dark')

store and store_accessor

They let you treat JSON or text-based hash columns like structured data, so you can access fields as if they were real database columns.

🔹 store

  • Used to declare a serialized store (usually a jsonb, json, or text column) on your model.
  • Works best with key/value stores.

👉 Example:

Let's say your users table has a settings column of type jsonb:

# migration
add_column :users, :settings, :jsonb, default: {}

Now in your model:

class User < ApplicationRecord
  store :settings, accessors: [:theme, :notifications], coder: JSON
end

You can now do this:

user.theme = "dark"
user.notifications = true
user.save

user.settings
# => { "theme" => "dark", "notifications" => true }

🔹 store_accessor

A lightweight version that only declares attribute accessors for keys inside a JSON column. It doesn't include serialization logic, so you usually use it with a json/jsonb/text column that already works as a Hash.

👉 Example:

class User < ApplicationRecord
  store_accessor :settings, :theme, :notifications
end

This gives you:

  • user.theme, user.theme=
  • user.notifications, user.notifications=

🤔 When to Use Each?

Feature | When to Use
store | When you need both serialization and accessors
store_accessor | When your column is already serialized (jsonb, etc.)

If you’re using PostgreSQL with jsonb columns โ€” it’s more common to just use store_accessor.

Querying JSON Fields
User.where("settings ->> 'theme' = ?", "dark")

Or if you’re using store_accessor:

User.where(theme: "dark")

💡 But remember: you'll only be able to query these fields efficiently if you're using jsonb + proper indexes.
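
For instance, if you filter users by keys inside settings, a GIN index (or an expression index on the one key you query) keeps those lookups fast. A hypothetical migration:

class AddIndexesToUsersSettings < ActiveRecord::Migration[8.0]
  def change
    # General-purpose GIN index for containment queries on the jsonb column
    add_index :users, :settings, using: :gin

    # Narrower alternative: index only the key you actually filter on
    add_index :users, "(settings ->> 'theme')", name: "index_users_on_settings_theme"
  end
end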


🔥 Conclusion:

  • PostgreSQL can store, search, and index inside JSON fields natively.
  • This lets you keep your schema flexible and your queries fast.
  • Combined with its advanced indexing, it's ideal for a modern e-commerce app with dynamic product attributes, filtering, and searching.

To install and set up PostgreSQL on macOS, you have a few options. The most common and cleanest method is using Homebrew. Here's a step-by-step guide:

Setting Up ⚙️ SSH in your system

SSH (Secure Shell) is used to establish secure remote connections over an unsecured network, enabling secure access, management, and data transfer on remote systems, including running commands, transferring files, and managing applications.

Setup SSH keys:

To create an SSH key and add it to your GitHub account, follow these steps:

1. Generate an SSH Key

ssh-keygen -t ed25519 -C "your-email@example.com"
  • Replace "your-email@example.com" with your GitHub email.
  • If prompted, press Enter to save the key in the default location (~/.ssh/id_ed25519).
  • Set a passphrase (optional, but recommended for security).

2. Start the SSH Agent

eval "$(ssh-agent -s)"

3. Add the SSH Key to the Agent

ssh-add ~/.ssh/id_ed25519

4. Copy the SSH Key to Clipboard

cat ~/.ssh/id_ed25519.pub | pbcopy   # macOS
cat ~/.ssh/id_ed25519.pub | xclip -selection clipboard   # Linux
clip < ~/.ssh/id_ed25519.pub   # Windows (Git Bash)

(If xclip is not installed, use sudo apt install xclip on Linux)


5. Add the SSH Key to GitHub

  • Go to GitHub → Settings → SSH and GPG keys (GitHub SSH Keys).
  • Click New SSH Key.
  • Paste the copied key into the field and give it a title.
  • Click Add SSH Key.

6. Test the SSH Connection

ssh -T git@github.com

You should see a message like:

Hi username! You've successfully authenticated, but GitHub does not provide shell access.

Now you can clone, push, and pull repositories without entering your GitHub password!

You may be wondering: what is ed25519?

ed25519 is a modern cryptographic algorithm used for generating SSH keys. It is an alternative to the older RSA algorithm and is considered more secure and faster.

Why Use ed25519 Instead of RSA?

  1. Stronger Security – ed25519 provides 128-bit security, while RSA requires a 4096-bit key for similar security.
  2. Smaller Key Size – The generated keys are much shorter than RSA keys, making them faster to use.
  3. Faster Performance – ed25519 is optimized for speed, especially on modern hardware.
  4. Resistant to Certain Attacks – Unlike RSA, ed25519 is resistant to side-channel attacks.

Why Does GitHub Recommend ed25519?

  • Since 2021, GitHub suggests using ed25519 over RSA because of better security and efficiency.
  • Older RSA keys (e.g., 1024-bit) are now considered weak.

When Should You Use ed25519?

  • Always, unless you’re working with old systems that do not support it.
  • If you need maximum security, speed, and smaller key sizes.

Example: Creating an ed25519 SSH Key

ssh-keygen -t ed25519 -C "your-email@example.com"

This creates a strong and secure SSH key for GitHub authentication.

What is the SSH Agent?

The SSH agent is a background process that securely stores your SSH private keys and manages authentication.

Instead of entering your private key passphrase every time you use SSH (e.g., for git push), the agent remembers your key after you add it.


Why Do We Need the SSH Agent?

  1. Avoid Entering Your Passphrase Repeatedly
  • If your SSH key has a passphrase, you would normally need to enter it every time you use git push or ssh.
  • The agent caches the key in memory so you don't need to enter the passphrase every time.
  2. Automatic Authentication
  • Once the agent has your key, it can sign SSH requests for authentication automatically.
  3. Keeps Your Private Key Secure
  • Your private key stays in memory and is not exposed on disk after adding it.

What Do These Commands Do?

1️⃣ Start the SSH Agent

eval "$(ssh-agent -s)"
  • This starts the SSH agent in the background.
  • The eval "$(ssh-agent -s)" command initializes the agent and prints its process ID (PID), confirming it's running.

2️⃣ Add the SSH Key to the Agent

ssh-add ~/.ssh/id_ed25519
  • This adds your private key (~/.ssh/id_ed25519) to the agent.
  • Now, the agent will remember the key and use it when making SSH connections.

How to Check If Your Key is Added?

Run:

ssh-add -l

This lists all keys currently added to the agent. If your key is listed, it means the agent is working correctly.


Example: How Does This Help?

Without the agent:

git push origin main
# Every time, you need to enter your SSH passphrase

With the agent:

ssh-add ~/.ssh/id_ed25519
git push origin main
# No need to enter your passphrase again!

Conclusion
  • The SSH agent stores your private key securely in memory.
  • It automates authentication, so you don't need to enter your passphrase for every SSH request.
  • Adding the key with ssh-add lets Git and SSH use it without prompting.

Now, you can git push and git pull seamlessly!

But the question is: could this let an unauthorized person use these commands without the SSH passphrase?


Does using ssh-agent make your system less secure? Let’s break it down.

Can Someone Else Use My SSH Key Without a Passphrase?

✅ No, If Your System Is Secure

  • The SSH agent only runs under your user session.
  • Other users on the same machine cannot access your agent.
  • Your private key is never written to disk in an unprotected way.
  • The agent only allows authentication, not direct access to your key.

⚠️ Yes, If Someone Gets Access to Your Session

  • If an attacker gains access to your system (e.g., via malware, or if you leave your computer unlocked), they might use your active SSH agent to authenticate without needing your passphrase.
  • However, they cannot extract your private key from the agent.

How to Improve Security?

If you want extra security, here are a few things you can do:

1️⃣ Remove Keys from Agent When Not Needed

After using your SSH key, you can remove it from the agent:

ssh-add -D

This removes all stored keys. Next time you push, you’ll need to re-enter your passphrase.


2️⃣ Use -t (Timeout) for Auto Removal

To automatically remove the key after a set time:

ssh-add -t 3600 ~/.ssh/id_ed25519  # Removes the key after 1 hour


3️⃣ Lock Your Screen When Away

If someone gets access to your logged-in session, they could use your agent to authenticate without needing the passphrase.

Always lock your screen (Win + L on Windows, Super + L on most Linux desktops, Cmd + Ctrl + Q on macOS) when stepping away.


4️⃣ Disable Agent Forwarding (Extra Security)

By default, SSH agent forwarding (ssh -A) can let remote servers use your agent, and therefore your keys. If you don't need it, disable it by editing:

nano ~/.ssh/config

And adding:

Host *
    ForwardAgent no

Summary
  1. The SSH agent only runs in your session, so no one else can access it unless they get control of your user session.
  2. Attackers cannot steal your private key from the agent, but if they have access to your session, they could use it.
  3. To be safe, remove keys when not needed (ssh-add -D), use timeouts (-t), and always lock your computer.

You're now both secure and productive with SSH! 🚀

Setup 🛠 Rails 8 App – Part 3: Git setup, modify gitignore, git config

Now let's push the code to the GitHub repository. Before that, a few final checks need to be done on our end.

Check and update the .gitignore file

✅ Files/Folders to Include in .gitignore

Here's a breakdown of which files/folders should be added to .gitignore in your Rails project:

These files are user-specific or environment-specific and should not be committed to Git.

1️⃣ .dockerignore → ❌ Do not add to .gitignore

  • Keep this file if you're using Docker.
  • It's like .gitignore but for Docker, helping to reduce the Docker image size.
  • Do not add it to .gitignore.

2️⃣ .github/ → ✅ Add to .gitignore (only if it contains personal CI/CD configs)

  • If this contains GitHub Actions workflows or issue templates, keep it in the repo.
  • If it's just for personal workflows, add it to .gitignore.

3️⃣ .kamal/ → ✅ Add to .gitignore

  • This contains deployment secrets and configuration files for Kamal (deployment tool).
  • It's usually auto-generated and should not be committed.

4️⃣ .vscode/ → ✅ Add to .gitignore

  • User-specific VSCode settings, should not be committed.
  • Different developers use different editors.

Keep These Files in Git (Don’t Add to .gitignore)

These files are important for project configuration.

1️⃣ .gitattributes → Keep in Git (don't ignore)

  • Defines how Git handles line endings and binary files.
  • Helps avoid conflicts on Windows/Linux/Mac.

2️⃣ .gitignore → Keep in Git (don't ignore)

  • Defines ignored files, obviously should not be ignored itself.

3️⃣ .rubocop.yml → Keep in Git (don't ignore)

  • This is for Rubocop linting rules, which helps maintain coding style.
  • All developers should follow the same rules.

4️⃣ .ruby-version → Keep in Git (don't ignore)

  • Defines the Ruby version for the project.
  • Ensures all team members use the same Ruby version.

Final .gitignore Entries Based on Your Files

# Ignore log files, temp files, and dependencies
/log/
/tmp/
.bundle/
/node_modules/

# Manually added
# Ignore editor & environment-specific configs
.vscode/

# Ignore deployment configs
.kamal/

# Ignore personal GitHub configs (if applicable)
.github/

Final Summary

Folder | Include in Git? | Why?
log/ | ❌ Ignore | Dynamically generated logs
public/ | ✅ Keep (except public/assets/) | Static files like favicon, error pages
script/ | ✅ Keep | Old Rails script files (if used)
storage/ | ❌ Ignore | ActiveStorage uploads (except seed/)
test/ | ✅ Keep | Contains important test cases
tmp/ | ❌ Ignore | Temporary runtime files
vendor/ | ❌ Ignore (except custom libraries) | Third-party libraries

First time git setup

# You can view all of your settings and where they are coming from using

git config --list --show-origin

# Your Identity: The first thing you should do is to set your user name and email address.

git config --global user.name "Abhilash"
git config --global user.email abhilash@example.com

# configure the default text editor that will be used when Git needs you to type in a message

git config --global core.editor emacs
git config --global core.editor "code --wait"     # vs code
git config --global -e      # verify editor

# command to list all the settings Git can find at that point

git config --list
git config user.name

Add your SSH key to your GitHub account. Check the post: https://railsdrop.com/2025/03/30/setting-up-ssh-in-your-system/

Initial commit: Execute the Git commands

Run the following commands in your rails app folder:

git add .
git commit -m "first commit"
git branch -M main
git remote add origin git@github.com:abhilashak/design_studio.git
git remote -v     # check the remote endpoints
git push -u origin main

git log      # check commit details

What It Does:

git branch -M main
  1. Renames the current branch to main.
  2. The -M flag forcefully renames the branch (overwrites if needed).

Common Use Case:

  • If your branch is named master and you want to rename it to main (which is now the default in many repositories).
  • If you created a branch with a different name and want to standardise it as main.

Example Usage:

git branch -M main
git push -u origin main

This renames the current branch to main and then pushes it to the remote repository.

Use the GitHub SSH option: it uses the SSH key that you set up in your GitHub account for that machine.
The HTTPS option asks for your GitHub credentials to log in.

The -u option in the command:

git push -u origin main

What Does -u Do?

The -u flag stands for --set-upstream. It sets the upstream branch for main, which means:

  • It links your local branch (main) to the remote branch (main on origin).
  • After running this command once, you can simply use:
  git push

instead of git push origin main, because Git now knows where to push.

Example Use Case:

If you just created a new branch (main in this case) and are pushing it for the first time:

git push -u origin main

This ensures that main always pushes to origin/main without needing to specify it every time.

After Running This Command:

โœ… Next time, you can simply use:

git push
git pull

without needing to specify origin main again.

For a better Git workflow

Check the post: https://railsdrop.com/2025/03/29/git-workflow-best-practices-for-your-development-process/

Want to See Your Upstream Branch?

Run:

git branch -vv

This shows which remote branch your local branches are tracking.

To be continued… 🚀


Setup 🛠 Rails 8 App – Part 2: Command line, VS Code, RuboCop, Server, Action Text, Image processing

In the first part of this guide, we covered setting up a Rails 8 app with essential configurations. In this follow-up, we'll go over optimizing command-line usage, setting up VS Code for development, running migrations, styling the app, and enabling Action Text.


1. Optimizing Command-Line Usage with Aliases

One of the best ways to speed up development is to create shortcuts for frequently used commands. You can do this by adding aliases to your shell configuration.

Steps to Add an Alias:

  1. Open your shell configuration file:
    vim ~/.zshrc
  2. Search for the alias section:
    <esc> / alias <enter>
  3. Add your alias:
    alias gs="git status"
  4. Save and exit:
    <esc> :wq
  5. Reload your configuration:
    source ~/.zshrc
  6. Use your new alias:
    gs

This method saves time by allowing you to run frequently used commands more quickly.


2. Using Terminal Efficiently in VS Code

By default, VS Code uses Ctrl + ` (backtick) to toggle the terminal, which may not be intuitive. You can change this shortcut:

  1. Open VS Code.
  2. Go to Settings → Keyboard Shortcuts.
  3. Search for Toggle Terminal.
  4. Click Edit and change it to Ctrl + Opt + T for easier access.

3. Setting Up RuboCop in VS Code

RuboCop ensures your Ruby code follows best practices. Here's how to set it up:

Checking RuboCop from the Terminal:

rubocop .

VS Code Setup:

  1. Open Command Palette (Cmd + Shift + P) and search for “Lint by RuboCop”.
  2. Go to Extensions Tab and install “VS Code RuboCop”.
  3. In VS Code Settings, search for “Rubocop” and check Ruby -> Rubocop -> Execute Path.
  4. Find the RuboCop installation path with whereis rubocop. Example output: ~/.local/share/mise/installs/ruby/3.4.1/bin/rubocop
  5. Update the Execute Path in VS Code to: ~/.local/share/mise/installs/ruby/3.4.1/bin/
  6. If RuboCop still returns an empty output, check .rubocop.yml in your project: ~/rails/design_studio/.rubocop.yml
  7. If the issue persists, ensure the gem is installed: gem install rubocop
  8. Restart VS Code from the Rails project root: code .

For more details, check the official documentation: RuboCop Usage


4. Running Migrations and Starting the Server

Running Migrations:

rails db:migrate -t

You can check db/schema.rb to see the Active Storage tables for attachments and other info.

Starting the Rails Server:

bin/dev

Access your application at: http://localhost:3000/

Check the product routes with bin/rails routes -g product.


5. Adding Basic Styles

To quickly improve the appearance of your site, add this to application.html.erb:

<%= stylesheet_link_tag "https://cdn.simplecss.org/simple.css" %>

Alternatively, use Tailwind CSS for a more modern approach.

An example of convention over configuration in a Rails view file:

<div id="products">
  <%= render brand.products %>
</div>

Rails will look in the app/views/products/ folder and render the matching partial (app/views/products/_product.html.erb) for each product.


6. Debugging with Rails Console

If an error occurs, you can inspect variables and data in the Rails console:

rails console


7. Installing Action Text for Rich Text Editing

Action Text is not installed by default but can be added easily:

rails action_text:install

This command:

  • Installs JavaScript dependencies
  • Creates necessary stylesheets
  • Generates database migrations for rich text

Running Migrations:

rails db:migrate -t

If you encounter an error about missing gems:

Could not find gem 'image_processing (~> 1.2)'

Run:

bundle install
rails db:migrate -t

Updating the Product Model:

Add this to app/models/product.rb:

has_rich_text :description

Updating the Form View:

In app/views/products/_form.html.erb, replace the description input with:

<%= form.rich_text_area :description %>

Now, visiting the new product page should display a rich text editor.


8. Solving Image Processing Errors

If you see an error like:

LoadError: Could not open library 'vips.42'

Install libvips to resolve it:

brew install vips


Conclusion

In this second part, we covered:

  • Optimizing terminal usage with aliases.
  • Configuring VS Code for efficient development.
  • Running migrations and starting the Rails server.
  • Enhancing the UI with styles.
  • Debugging using Rails console.
  • Installing and configuring Action Text for rich text support.

Stay tuned for more Rails 8 improvements!

Part 3: https://railsdrop.com/2025/03/25/setup-rails-8-app-git-setup-gitignore-part-3/

To be continued… 🚀

Setup 🛠 Rails 8 App – Part 1: Setup All Necessary Configurations | Ruby | Rails setup | Kamal | Rails Generators

Ruby on Rails 8 introduces several improvements that make development easier, more secure, and more maintainable. In this guide, we’ll walk through setting up a new Rails 8 application while noting the significant configurations and features that come out of the box.

1. Check Your Ruby and Rails Versions

If you don't have Ruby 3.4 and Rails 8.0 installed, please check: https://railsdrop.com/2025/02/11/installing-and-setup-ruby-3-rails-8-vscode-ide-on-macos-in-2025/

Before starting, ensure that you have the correct versions of Ruby and Rails installed:

$ ruby -v
ruby 3.4.1

$ rails -v
Rails 8.0.1

If you don't have these versions installed, update them using your package manager or version manager (like rbenv or RVM).

2. Create a New Rails 8 Application

Run the following command to create a new Rails app:

$ rails new design_studio

Noteworthy Files and Directories Created

Here are some interesting files and directories that are generated with a new Rails 8 app:

 create  .ruby-version
 create  bin/brakeman
 create  bin/rubocop
 create  bin/docker-entrypoint
 create  .rubocop.yml
 create  .github/workflows
 create  .github/workflows/ci.yml
 create  config/cable.yml
 create  config/storage.yml
 create  config/initializers/content_security_policy.rb
 create  config/initializers/filter_parameter_logging.rb
 create  config/initializers/new_framework_defaults_8_0.rb

Key Takeaways:

  • Security & Code Quality Tools: Brakeman (security scanner) and RuboCop (code style linter) are included by default.
  • Docker Support: The presence of bin/docker-entrypoint suggests better built-in support for containerized deployment.
  • GitHub Actions Workflow: The .github/workflows/ci.yml file provides default CI configurations.
  • Enhanced Security: The content_security_policy.rb initializer helps enforce a strict security policy.
  • New Rails Defaults: The new_framework_defaults_8_0.rb initializer helps manage breaking changes in Rails 8.

Rails automatically sets up the following while creating the new app with rails new.

a. Configuring Import Maps and Installing Turbo & Stimulus

Rails 8 still defaults to Import Maps for JavaScript package management, avoiding the need for Node.js and Webpack:

$ rails turbo:install stimulus:install

This creates the following files:

create    config/importmap.rb
create    app/javascript/controllers
create    app/javascript/controllers/index.js
create    app/javascript/controllers/hello_controller.js
append    config/importmap.rb

Key Takeaways:

  • Import Maps: Defined in config/importmap.rb, allowing dependency management without npm.
  • Hotwired Support: Turbo and Stimulus are automatically configured for modern front-end development.
  • Generated Controllers: Stimulus controllers are pre-configured inside app/javascript/controllers/.

b. Deploying with Kamal

Kamal simplifies deployment with Docker, without needing Kubernetes. Rails 8 includes built-in support:

$ bundle binstubs kamal
$ bundle exec kamal init

This results in:

Created .kamal/secrets file
Created sample hooks in .kamal/hooks

Key Takeaways:

  • Automated Deployment Setup: Kamal provides easy-to-use deployment scripts.
  • Secret Management: The .kamal/secrets file ensures secure handling of credentials.
  • Deployment Hooks: Custom hooks allow pre- and post-deployment scripts for automation.

c. Setting Up Caching and Queues with Solid Cache, Queue, and Cable

NOTE: Rails automatically creates this for you while creating the rails app.

Rails 8 includes Solid Cache, Solid Queue, and Solid Cable for enhanced performance and scalability:

$ rails solid_cache:install solid_queue:install solid_cable:install

This creates:

create  config/cache.yml
create  db/cache_schema.rb
create  config/queue.yml

Key Takeaways:

  • Caching Support: config/cache.yml manages application-wide caching.
  • Database-Powered Queue System: Solid Queue simplifies background job management without requiring external dependencies like Sidekiq.
  • Real-Time WebSockets: Solid Cable offers Action Cable improvements for real-time features.

3. Rails 8 Migration Enhancements

Rails 8 provides new shortcuts and syntax improvements for database migrations:

NOT NULL Constraints with ! Shortcut

You can impose NOT NULL constraints directly from the command line using !:

# Example for not null constraints:
➜ rails generate model User name:string!

Type Modifiers in Migrations

Rails 8 allows passing commonly used type modifiers directly via the command line. These modifiers are enclosed in curly braces {} after the field type.

# Example for model generation:
➜ rails generate model Product name:string description:text
# Example for passing modifiers:
➜ rails generate migration AddDetailsToProducts 'price:decimal{5,2}' supplier:references{polymorphic}
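
For reference, the AddDetailsToProducts call above produces a migration roughly like this (a sketch, not the literal generated file):

class AddDetailsToProducts < ActiveRecord::Migration[8.0]
  def change
    add_column :products, :price, :decimal, precision: 5, scale: 2
    add_reference :products, :supplier, polymorphic: true, null: false
  end
end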

Generating a Scaffold for the Product Model

Let’s generate a complete scaffold for our Product model:

✗ rails generate scaffold product title:string! description:text category:string color:string 'size:string{10}' 'mrp:decimal{7,2}' 'discount:decimal{7,2}' 'rating:decimal{1,1}'
➜  design_studio git:(main) ✗ rails -v
Rails 8.0.1
➜  design_studio git:(main) ✗ ruby -v
ruby 3.4.1 (2024-12-25 revision 48d4efcb85) +PRISM [arm64-darwin24]
➜  design_studio git:(main) ✗ rails generate scaffold product title:string! description:text category:string color:string 'size:string{10}' 'mrp:decimal{7,2}' 'discount:decimal{7,2}' 'rating:decimal{1,1}'

Using the Rails Resource Generator

The rails g resource command is a powerful way to generate models, controllers, migrations, and routes all in one go. This is particularly useful when setting up RESTful resources in a Rails application.

Basic Syntax

➜ rails g resource product

This command creates the necessary files for a new resource, including:

  • A model (app/models/product.rb)
  • A migration file (db/migrate/)
  • A controller (app/controllers/products_controller.rb)
  • Routes in config/routes.rb
  • A test file (test/controllers/products_controller_test.rb or spec/)

Example Usage

To generate a Product resource with attributes:

➜ rails g resource Product title:string! description:text brand:references

This will:

  1. Create a Product model with title and description attributes.
  2. Add a brand_id foreign key as a reference.
  3. Apply a NOT NULL constraint on title (! shortcut).
  4. Generate a corresponding migration file.
  5. Set up routes automatically (resources :products).
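
The generated model and routes end up looking roughly like this (a sketch):

# app/models/product.rb
class Product < ApplicationRecord
  belongs_to :brand
end

# config/routes.rb (line added by the generator)
resources :products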

Running the Migration

After generating a resource, apply the migration to update the database:

➜ rails db:migrate


Difference Between resource and scaffold

Rails provides both rails g resource and rails g scaffold, but they serve different purposes:

Feature | rails g resource | rails g scaffold
Generates a Model | ✅ | ✅
Generates a Migration | ✅ | ✅
Generates a Controller | ✅ (empty actions) | ✅ (full CRUD actions)
Generates Views (HTML/ERB) | ❌ | ✅ (index, show, new, edit, etc.)
Generates Routes | ✅ | ✅
Generates Helper Files | ❌ | ✅
Generates Tests | ✅ | ✅

  • rails g resource is minimal: it generates only essential files without view templates. It's useful when you want more control over how your views and controller actions are built.
  • rails g scaffold is more opinionated and generates full CRUD functionality with prebuilt forms and views, making it ideal for rapid prototyping.

If you need full CRUD functionality quickly, use scaffold. If you prefer a leaner setup with manual control over implementation, use resource.

Conclusion

Rails 8 significantly enhances the development experience with built-in security tools, CI/CD workflows, streamlined deployment via Kamal, and modern front-end support with Turbo & Stimulus. It also improves caching, background jobs, and real-time features with Solid tools.

These improvements make Rails 8 a robust framework for modern web applications, reducing the need for additional dependencies while keeping development efficient and secure.

Enjoy Rails! 🚀

Setup Nginx, SSL, Firewall | Moving micro-services into AWS EC2 instance – Part 4

Install the Nginx proxy server. Nginx can also act as a load balancer, which helps balance network traffic.

sudo apt-get update
sudo apt-get install nginx

Commands to stop, start, restart, check status

sudo systemctl stop nginx
sudo systemctl start nginx
sudo systemctl restart nginx

# after making configuration changes
sudo systemctl reload nginx
sudo systemctl disable nginx
sudo systemctl enable nginx

Install SSL – Let's Encrypt

Install the packages needed for SSL:

sudo add-apt-repository ppa:certbot/certbot
sudo apt-get update
sudo apt-get install python-certbot-nginx

Install the SSL Certificate:

certbot -d '*.domain.com' -d domain.com --manual --preferred-challenges dns certonly

Your certificate and chain have been saved at:
   /etc/letsencrypt/live/domain.com/fullchain.pem

Your key file has been saved at:
   /etc/letsencrypt/live/domain.com/privkey.pem
SSL certificate auto renewal

Let’s Encrypt’s certificates are valid for 90 days. To automatically renew the certificates before they expire, the certbot package creates a cronjob which will run twice a day and will automatically renew any certificate 30 days before its expiration.

Once a certificate is renewed we also have to reload the nginx service so it picks up the new files. To do so, append --renew-hook "systemctl reload nginx" to the /etc/cron.d/certbot file so that it looks like this:

/etc/cron.d/certbot
0 */12 * * * root test -x /usr/bin/certbot -a \! -d /run/systemd/system && perl -e 'sleep int(rand(3600))' && certbot -q renew --renew-hook "systemctl reload nginx"

To test the renewal process, use the certbot --dry-run switch:

sudo certbot renew --dry-run

Renew your EXPIRED certificate this way:

sudo certbot --force-renewal -d '*.domain.com' -d domain.com --manual --preferred-challenges dns certonly

Are you OK with your IP being logged?
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
(Y)es/(N)o: Y

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
Please deploy a DNS TXT record under the name
_acme-challenge.<domain>.com with the following value:

O3bpxxxxxxxxxxxxxxxxxxxxxxxxxxY4TnNo

Before continuing, verify the record is deployed.
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
Press Enter to Continue

You need to update the DNS txt record for _acme-challenge.<domain>.com

sudo systemctl restart nginx # restart nginx to take effect

Configure the Firewall

Next, we'll update our firewall to allow HTTPS traffic.

Check the firewall status on the system. If it is inactive, enable the firewall.

sudo ufw status # check status

# enable firewall
sudo ufw enable
sudo ufw allow ssh
sudo ufw allow OpenSSH
sudo ufw allow 'Nginx Full' # assuming nginx installed via apt: opens ports 80 and 443 for HTTP/HTTPS

Enable particular ports where your micro-services are running. Example:

sudo ufw allow 4031/tcp # Authentication service
sudo ufw allow 4131/tcp # File service
sudo ufw allow 4232/tcp # Search service

You can delete the ‘Authentication service’ firewall rule by:

sudo ufw delete allow 4031/tcp

Setup Ruby, ruby-build, rbenv-gemset | Conclusion – Moving micro-services into AWS EC2 instance – Part 3

In this post, let's set up Ruby and Ruby gemsets for each project, so that your package versions are maintained.

Install ruby-build # ruby-build is a command-line utility for rbenv

git clone https://github.com/rbenv/ruby-build.git ~/.rbenv/plugins/ruby-build

# Add ruby build path

echo 'export PATH="$HOME/.rbenv/plugins/ruby-build/bin:$PATH"' >> ~/.bashrc # OR
echo 'export PATH="$HOME/.rbenv/plugins/ruby-build/bin:$PATH"' >> ~/.zshrc

# load it

source ~/.bashrc # OR
source ~/.zshrc


For Mac users (macOS)


# verify rbenv
curl -fsSL https://github.com/rbenv/rbenv-installer/raw/main/bin/rbenv-doctor | bash

If you are using zsh add the following to `~/.zshrc`

# rbenv configuration
eval "$(rbenv init -)"
export RUBY_CONFIGURE_OPTS="--with-openssl-dir=$(brew --prefix openssl@1.1)"

Install Ruby 2.5.1 using rbenv

rbenv install 2.5.1

rbenv global 2.5.1 # to make this version as default

ruby -v # must display 2.5.1 if installed correctly

which ruby # must show the fully qualified path of the executable

echo "gem: --no-document" > ~/.gemrc # to skip documentation while installing gem

rbenv rehash # the latest versions of rbenv apparently don't need this. Nevertheless, let's use it to avoid surprises.

gem env home # See related details

# If a new version of ruby was installed, ensure RubyGems is up to date.
gem update --system --no-document

Install rbenv gemset – https://github.com/jf/rbenv-gemset

git clone git://github.com/jf/rbenv-gemset.git ~/.rbenv/plugins/rbenv-gemset

If you are getting following issue:

fatal: remote error:
  The unauthenticated git protocol on port 9418 is no longer supported.
# Fix
 git clone https://github.com/jf/rbenv-gemset.git ~/.rbenv/plugins/rbenv-gemset

Now clone your project and go inside the project folder (the micro-service folder, say my-project) which has a Gemfile in it, and run the following commands.

cd my-project

my-project $ rbenv gemset init # NOTE: this will create the gemset under the current ruby version.

my-project $ rbenv gemset list # list all gemsets

my-project $ rbenv gemset active # check this in project folder

my-project $ gem install bundler -v '1.6.0'

my-project $ rbenv rehash

my-project $ bundle install  # install all the gems for the project inside the gemset.

my-project $ rails s -e production # start rails server
my-project $ puma -e production -p 3002 -C config/puma.rb # OR start puma server
# OR start the server you have configured with rails. 

Do this for all the services and see how they run. The above will install all the gems inside the project gemset, which acts like a namespace.

So our aim is to set up all the Ruby micro-services on the same machine.

  • I started 10 services together in AWS EC2 (type: t3.small).
  • Database is running in t2.small instance with 2 volumes (EBS) attached.
  • The background-job DB (Redis) is running in a t2.micro instance.

So for 3 EC2 instances + 2 EBS volumes (~$26) + Elastic IP addresses (AWS charges some amount, ~$7.4) over a 1-month duration, it costs me around $77.8, almost 6k rupees. That means we reduced the AWS cloud cost to half of the previous cost.

Setup RSpec, Factory Bot and Database Cleaner for Rails 5.2.6

To configure the best test suite in Rails using the RSpec framework and other supporting libraries, such as Factory Bot and Database Cleaner, we’ll remove the Rails native test folder and related configurations.

To begin, we’ll add the necessary gems to our Gemfile:

group :development, :test do
  # Rspec testing module and needed libs
  gem 'factory_bot_rails', '5.2.0'
  gem 'rspec-rails', '~> 4.0.0'
end

group :test do
  # db cleaner for test suite 
  gem 'database_cleaner-active_record', '~> 2.0.1'
end

Now do

bundle install # this installs all the above gems

If your Rails application already includes the built-in Rails test suite, you’ll need to remove it in order to use the RSpec module instead.

I recommend using RSpec over the Rails native test module, as RSpec provides more robust helpers and mechanisms for testing.

To disable the Rails test suite, navigate to the application.rb file and comment out the following line:

# require 'rails/test_unit/railtie'

inside the class Application add this line:

# Don't generate system test files.
config.generators.system_tests = nil

Remove the native rails test folder:

rm -r test/

We use factories over fixtures. Remove this line from rails_helper.rb

config.fixture_path = "#{::Rails.root}/spec/fixtures"

and modify this line to:

config.use_transactional_fixtures = false # instead of true

Setting config.generators.system_tests = nil prevents Rails from generating the native test files when we run Rails generators, and setting use_transactional_fixtures to false lets Database Cleaner manage the test data instead.

Database Cleaner

Now we configure the database cleaner that is used for managing data in our test cycles.

Open rails_helper.rb file and require that module

require 'rspec/rails'
require 'database_cleaner'  # <= add here

Note: Use Database Cleaner only if you run integration tests with Capybara or deal with JavaScript in the test suite.

"Capybara spins up an instance of our Rails app that can't see our test data transaction, so even though we've created a user in our tests, signing in will fail because to the Capybara-run instance of our app, there are no users."

I experienced database credentials issues:

➜ rspec
An error occurred while loading ./spec/models/user_spec.rb.
Failure/Error: ActiveRecord::Migration.maintain_test_schema!

Mysql2::Error::ConnectionError:
  Access denied for user 'username'@'localhost' (using password: NO)

Initially, I planned to use Database Cleaner, but later I realized that an error I was experiencing was actually due to a corrupted credentials.yml.enc file. I’m not sure how it happened.

To check if your credentials are still intact, try editing the file and verifying that the necessary information is still present.

EDITOR="code --wait" bin/rails credentials:edit

Now, in the RSpec configuration block, we add the Database Cleaner configuration.

Add the following file:

spec/support/database_cleaner.rb

Inside, add the following:

# DB cleaner using database cleaner library
RSpec.configure do |config|
  # This says that before the entire test suite runs, clear 
  # the test database out completely
  config.before(:suite) do
    DatabaseCleaner.strategy = :transaction
    DatabaseCleaner.clean_with(:truncation)
  end

  # This sets the default database cleaning strategy to 
  # be transactions
  config.before(:each) do
    DatabaseCleaner.strategy = :transaction
  end

  # include this if you use Capybara integration tests
  config.before(:each, :js => true) do
    DatabaseCleaner.strategy = :truncation
  end

  # These lines hook up database_cleaner around the beginning 
  # and end of each test, telling it to execute whatever 
  # cleanup strategy we selected
  config.before(:each) do
    DatabaseCleaner.start
  end

  config.after(:each) do
    DatabaseCleaner.clean
  end
end

and be sure to require this file in rails_helper.rb

require 'rspec/rails'
require 'database_cleaner'
require_relative 'support/database_cleaner'  # <= here

Configure Factories

Note: We use factories over fixtures because factories provide better features that make writing test cases an easy task.

Create a folder to generate the factories:

mkdir spec/factories

Rails generators will automatically generate factory files for models inside this folder.

A model generator automatically creates the following files:

spec/models/model_spec.rb
spec/factories/model.rb

Now let's load the Factory Bot configuration into the Rails test suite.

Add the following file:

spec/support/factory_bot.rb
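
A minimal version of that file (the standard Factory Bot setup for RSpec) looks like this:

# spec/support/factory_bot.rb
RSpec.configure do |config|
  # Lets specs call create(:user) / build(:user) without the FactoryBot. prefix
  config.include FactoryBot::Syntax::Methods
end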

and be sure to require this file in rails_helper.rb

require 'rspec/rails'
require 'database_cleaner'
require_relative 'support/database_cleaner'
require_relative 'support/factory_bot'  # <= here

You can see the following line commented out in rails_helper.rb:

# Dir[Rails.root.join('spec', 'support', '**', '*.rb')].sort.each { |f| require f }

You can uncomment the line to automatically require every file under spec/support in your test suite, but I don’t recommend this approach as it can slow down test startup. Instead, it’s better to require each support file as needed.

Here’s the final version of the rails_helper.rb file. Note that we won’t be using Capybara for integration tests, so we’re not including the database_cleaner configuration:

# This file is copied to spec/ when you run 'rails generate rspec:install'
require 'spec_helper'
ENV['RAILS_ENV'] ||= 'test'
require File.expand_path('../config/environment', __dir__)
# Prevent database truncation if the environment is production
abort('The Rails environment is running in production mode!') if Rails.env.production?
require 'rspec/rails'
require_relative 'support/factory_bot'

# Checks for pending migrations and applies them before tests are run.
# If you are not using ActiveRecord, you can remove these lines.
begin
  ActiveRecord::Migration.maintain_test_schema!
rescue ActiveRecord::PendingMigrationError => e
  puts e.to_s.strip
  exit 1
end
RSpec.configure do |config|
  # If you're not using ActiveRecord, or you'd prefer not to run each of your
  # examples within a transaction, remove the following line or assign false
  # instead of true.
  config.use_transactional_fixtures = false

  config.infer_spec_type_from_file_location!

  # Filter lines from Rails gems in backtraces.
  config.filter_rails_from_backtrace!
  # arbitrary gems may also be filtered via:
  # config.filter_gems_from_backtrace("gem name")
end

The spec directory will look something like this:

spec/
  controllers/
    user_controller_spec.rb
    product_controller_spec.rb
  factories/
    user.rb
    product.rb
  models/
    user_spec.rb
    product_spec.rb
  mailers/
    mailer_spec.rb
  services/
    service_spec.rb  
  rails_helper.rb
  spec_helper.rb

References:

https://github.com/rspec/rspec-rails
https://relishapp.com/rspec/rspec-rails/docs
https://github.com/thoughtbot/factory_bot/blob/master/GETTING_STARTED.md#configure-your-test-suite
https://github.com/DatabaseCleaner/database_cleaner

Model Specs

Let’s generate a model spec. A model spec is used to test the smaller parts of the system, such as classes and methods.

# RSpec also provides its own spec file generators
➜ rails generate rspec:model user
      create  spec/models/user_spec.rb
      invoke  factory_bot
      create    spec/factories/users.rb
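For reference, the generated files are just skeletons; they look roughly like this (the factory stays empty until you add attributes):

# spec/models/user_spec.rb (generated)
require 'rails_helper'

RSpec.describe User, type: :model do
  pending "add some examples to (or delete) #{__FILE__}"
end

# spec/factories/users.rb (generated)
FactoryBot.define do
  factory :user do
  end
end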

Now run the rspec command. That’s it. You can see the output from RSpec.

➜ rspec
*

Pending: (Failures listed here are expected and do not affect your suite's status)

  1) User add some examples to (or delete) /home/.../spec/models/user_spec.rb
     # Not yet implemented
     # ./spec/models/user_spec.rb:4

Finished in 0.00455 seconds (files took 1.06 seconds to load)
1 example, 0 failures, 1 pending

Let’s discuss how to write a perfect model spec in the next lesson.

AWS SNS: How to send SMS to a topic, OTP SMS, promotional or transactional SMS

You can send SMS in two ways using AWS SNS:

  1. Subscribe numbers to a topic that is already created in AWS SNS and send the SMS to all numbers that have subscribed.
  2. Send the SMS directly to a mobile number.

The following AWS doc is a good starting point; it describes how to create a topic, subscribe to a topic, and send SMS messages to mobile numbers.

https://docs.aws.amazon.com/code-samples/latest/catalog/code-catalog-ruby-example_code-sns.html

For demonstration purposes, I use Ruby here.

  1. Subscribe to a topic and send SMS
require 'aws-sdk-sns'  # v2: require 'aws-sdk'

sns = Aws::SNS::Resource.new(region: 'us-west-2')

topic = sns.topic('arn:aws:sns:us-west-2:123456789:MySampleTextTopic')

topic.publish({
  message: 'Hello!'
})

This assumes you already created a topic inside your AWS console:

Go to the AWS Console

  1. Open AWS SNS
  2. Go to the left side -> Mobile -> Text messaging (SMS)
  3. Create a topic

You can also follow the above link to use the API to create a topic, subscribe to it, and send SMS messages, as sketched below.
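A rough sketch of that flow using the Ruby SDK client; the topic name and phone number here are placeholders, not values from the AWS doc:

require 'aws-sdk-sns'

client = Aws::SNS::Client.new(region: 'us-west-2')

# Create the topic (returns the existing ARN if the topic already exists)
topic_arn = client.create_topic(name: 'MySampleTextTopic').topic_arn

# Subscribe a mobile number to the topic using the SMS protocol
client.subscribe(
  topic_arn: topic_arn,
  protocol: 'sms',
  endpoint: '+15551230000'  # placeholder number in E.164 format
)

# One publish call fans out to every subscribed number
client.publish(topic_arn: topic_arn, message: 'Hello!')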

2. Send SMS directly to a mobile number (e.g. sending an OTP SMS)

You can use either the AWS console or the SNS API to send the SMS.

AWS console:

  1. Open AWS SNS
  2. Go to the left side -> Mobile -> Text messaging (SMS)
  3. We are using transactional text messages
  4. Go to Publish text message
  5. Select Transactional, add the mobile number, and publish it

In almost all cases we use an API for sending OTP SMS. For that, follow these steps.

SNS API – Steps

gem install aws-sdk-sns

require 'aws-sdk-sns'

otp = generate_otp
set_sns_client
set_sns_client_attrs
response = publish_sms(otp)
  • Generate an OTP
def generate_otp
  (1000..9999).to_a.sample
end
  • Set SNS client
def set_sns_client
  @sns_client = Aws::SNS::Client.new(
    region: ENV['AWS_SNS_REGION'],
    access_key_id: ENV['AWS_SNS_ACCESS_KEY'],
    secret_access_key: ENV['AWS_SNS_SECRET_KEY']
  )
end

  • Set SNS client attributes
def set_sns_client_attrs
  @sns_client.set_sms_attributes({
    attributes: {
      'DefaultSenderID' => SENDER_ID,
      'DefaultSMSType' => SMS_TYPE
    }
  })
end

  • Publish SMS
def publish_sms(otp)
  @sns_client.publish({
    phone_number: @mobile_no,
    message: "#{OTM_MSG} #{otp}"
  })
end

Here,

OTM_MSG: ‘Your OTP for login is’

SMS_TYPE: ‘Transactional’ or ‘Promotional’

If you want to send an OTP, it is ‘Transactional’. If you want to send promotional SMS about your product, it is ‘Promotional’.

SENDER_ID: the SMS header (sender ID) that you have already registered with TRAI.
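Putting the pieces above together, here is a minimal sketch of how these methods might be wrapped into a small OTP sender class. The constants and environment variable names follow the snippets above; the class name, sender ID value, and usage are assumptions for illustration.

# A sketch only: wraps the snippets above into one class.
require 'aws-sdk-sns'

class OtpSender
  OTM_MSG   = 'Your OTP for login is'.freeze
  SMS_TYPE  = 'Transactional'.freeze  # OTPs go out as transactional SMS
  SENDER_ID = 'MYAPP'.freeze          # placeholder; use your registered sender ID

  def initialize(mobile_no)
    @mobile_no = mobile_no  # E.164 format, e.g. '+919876543210'
  end

  def call
    otp = generate_otp
    set_sns_client
    set_sns_client_attrs
    publish_sms(otp)
    otp  # return the OTP so the caller can store and verify it later
  end

  private

  def generate_otp
    rand(1000..9999)  # same range as (1000..9999).to_a.sample above
  end

  def set_sns_client
    @sns_client = Aws::SNS::Client.new(
      region: ENV['AWS_SNS_REGION'],
      access_key_id: ENV['AWS_SNS_ACCESS_KEY'],
      secret_access_key: ENV['AWS_SNS_SECRET_KEY']
    )
  end

  def set_sns_client_attrs
    @sns_client.set_sms_attributes(
      attributes: {
        'DefaultSenderID' => SENDER_ID,
        'DefaultSMSType' => SMS_TYPE
      }
    )
  end

  def publish_sms(otp)
    @sns_client.publish(
      phone_number: @mobile_no,
      message: "#{OTM_MSG} #{otp}"
    )
  end
end

# Usage (placeholder number):
# OtpSender.new('+919876543210').call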

The steps to add a sender ID in AWS are given below.

For example, suppose your sender ID is: Zomato

Follow these steps to add our sender ID – Zomato – to AWS SNS:

  1. Sign in to the Amazon SNS console – https://console.aws.amazon.com/sns/home
  2. On the navigation panel, choose Mobile, Text messaging (SMS).
  3. On the Mobile text messaging (SMS) page, in the Text messaging preferences section, choose Edit.
  4. On the Edit text messaging preferences page, in the Details section, do the following:
  5. For Default sender ID, enter the sender ID to be used (Zomato) as the default for all messages from your account.
  6. Choose Save changes.

If you don’t want to register a sender ID, skip the set_sns_client_attrs method and just publish the SMS. SNS then treats the SMS as ‘Promotional’ and the sender ID will be an 8-character random string. Amazon sends this type of SMS over the international route and it costs almost $0.02 (Rs. 1.5) per SMS – a very high rate. So I recommend registering a sender ID that resembles your product or company name via Jio TrueConnect (which is free, link given below) and using it in SNS.

If you don’t know how to register sender id, follow this:

For the AWS SNS service, there are two routes for sending SMS:

  1. Local route
  2. International route

For the local route the price is Rs. 0.20 per SMS.
For the international route the price is Rs. 1.58 per SMS – too high.

By default, AWS SNS uses the international route.

If you are from India, follow the TRAI registration process.
To qualify for the local route, we have to register our use case and message templates with TRAI.

So first register here:
https://www.vilpower.in/
as an enterprise/company, with all company details and our purpose.

These registration requirements are designed to reduce the number of unsolicited messages that Indian consumers receive, and to protect consumers from potentially harmful messages.

You can register with Jio for free.

The link to register:
https://trueconnect.jio.com/#/home/entity-registration
Select Principal Entity and continue.

Recently, the Indian government made DLT registration mandatory for sending SMS.

Example:
Take a message from AD-ZOMATO. Here ZOMATO is the 6-character sender ID that we could previously just hand to the service provider and start sending SMS with. Now we have to register it under DLT; only then can our service provider use it.

After DLT registration we get an ENTITY ID. This entity ID needs to be attached to our OTP service provider in order to send OTP messages.

If you are using the SNS service for the first time, you should also increase your SMS spending quota:

AWS says:

If you're new to SMS messaging with Amazon SNS, request a monthly SMS spending threshold that meets the expected demands of your SMS use case. By default, your monthly spending threshold is $1.00 (USD). You can request to increase your spending threshold in the same support case that includes your request for a sender ID

Because Amazon SNS is a distributed system, it stops sending SMS messages within minutes of the spending quota being exceeded. During this period, if you continue to send SMS messages, you might incur costs that exceed your quota.

https://docs.aws.amazon.com/sns/latest/dg/channels-sms-awssupport-sender-id.html

Requesting increases to your monthly SMS spending quota for Amazon SNS:

https://docs.aws.amazon.com/sns/latest/dg/channels-sms-awssupport-spend-threshold.html

Currently, Amazon SNS supports SMS messaging in the following AWS Regions:

https://docs.aws.amazon.com/sns/latest/dg/sns-supported-regions-countries.html

Reference:

https://docs.aws.amazon.com/sdk-for-ruby/v2/api/Aws/SNS/Client.html