The Python ecosystem has grown to serve a vast variety of users and domains, becoming the de-facto glue language in software development. This versatility has also made it the preferred beginner language. You can find support tools and libraries for just about anything in Python: Web development (Django, Flask, FastAPI), AI/ML (PyTorch, NumPy, pandas), GUI development (tkinter, PyQt), game development, automation, data science, and the list goes on. There's always more than one way to accomplish something, and users often have domain-specific preferences (like Anaconda for Data Science or specific frameworks for web development).
Python has earned its reputation as the second best language for any problem – it's good enough for most use cases and, more importantly, development in it is fast enough that it becomes an appealing choice for getting things done quickly.
Naturally, this flexibility comes with a certain level of ambiguity and messiness when it comes to Python developer experience (DX) tools, guidelines, and best practices. Many developers skip these practices entirely – after all, fast and messy development is probably what drew them to Python in the first place. Beginners, especially, are often unaware that such development practices even exist, let alone how to implement them effectively.
This creates a paradox: Python's strength (rapid development) can become a weakness when projects grow in complexity or when working in teams. What starts as a quick script can evolve into a critical application, and without proper tooling and practices, maintenance becomes a nightmare.
In this post, I've compiled what I've learned and curated for my Python development setup, practices, and environment. My perspective leans heavily toward web development, but these tools and practices are valuable regardless of your domain. Whether you're building APIs, analyzing data, or creating desktop applications, having a solid development foundation will pay dividends as your projects grow and evolve.
This is a comprehensive guide covering the entire Python development ecosystem—from basic tooling to advanced observability. It's designed as a reference rather than a linear read. Feel free to jump to sections that interest you or bookmark this for future reference when setting up new projects.
For quick experiments, a plain venv works. For collaborative work, uv gives you one fast, consistent CLI that covers environments, dependencies, lockfiles, Python installs, tools, and even publishing—replacing a grab bag of pip/pip-tools/pipx/pyenv/poetry/twine steps.
uv is fast (written in Rust), has a clear command set, supports modern standards (PEP 723 scripts, PEP 735 dependency groups), and fits both apps and packages.
Let’s walk through a web-app workflow and then call out a few “power” features.
```shell
uv init --app
uv python pin 3.13            # writes .python-version for this project
# optional: uv python install 3.13  # uv will auto-install on demand anyway
```
uv init scaffolds an application project; use uv python pin to set the interpreter version for the project. uv can automatically download needed Python versions when you run commands.
```shell
uv add django djangorestframework
uv add --dev pre-commit ruff
uv add --group docker gunicorn
```
--dev and --group put packages into the right places in pyproject.toml under PEP 735 groups.
```shell
uv sync            # install per pyproject + uv.lock (creates lock if missing)
uv sync --frozen   # use lockfile strictly; don't update it
```
uv workspaces give you a single lockfile across multiple packages: run and sync at the workspace root, and declare internal dependencies via tool.uv.sources = { pkg = { workspace = true } }.
Example root snippet:
```toml
[tool.uv.workspace]
members = ["apps/*", "libs/*"]

[tool.uv.sources]
mylib = { workspace = true }
```
This keeps app/lib versions coordinated and editable.
Install from existing files:
```shell
uv pip install -r requirements.txt
```
Generate a pinned requirements.txt from your project (useful for infra that expects it):
```shell
uv pip compile pyproject.toml -o requirements.txt
```
Or export your uv.lock into requirements.txt:
```shell
uv export --no-hashes --format requirements-txt > requirements.txt
```
(The docs recommend not maintaining both unless you must.)
Install/list/switch versions; pin per-project or globally:
```shell
uv python install 3.13
uv python pin 3.13
uv python list
```
uv auto-downloads versions on demand, so the explicit install is often unnecessary.
Run ephemeral tools fast with the uvx alias:
```shell
uvx ruff --version
```
Or install them on your PATH:
```shell
uv tool install ruff
```
This isolates tools from your project yet keeps them handy on the CLI.
Self-contained scripts run with their declared deps:
```python
#!/usr/bin/env -S uv run --script
# /// script
# dependencies = ["httpx", "rich"]
# ///
import httpx
from rich import print

print(httpx.get("https://example.com").status_code)
```
uv run script.py builds an ephemeral env from that header. You can even lock a script with uv lock --script.
Ship packages without separate tools:
```shell
uv build
uv publish   # upload to PyPI or another index
```
uv.lock captures platform markers so one lock works across OS/arch/Python versions—useful for mixed macOS/Linux teams and CI.
direnv auto-loads project-specific env on cd and unloads on exit, so you don’t pollute your global shell or keep re-sourcing files. Every time you hit enter in the terminal, direnv checks whether the current folder has an approved .envrc (or .env) file. If it does, it loads or unloads the environment variables accordingly.
Setup (once): install and add the shell hook, e.g. eval "$(direnv hook zsh)" in ~/.zshrc (or bash in ~/.bashrc), then restart your shell.
My minimal .envrc for uv projects:
```shell
dotenv_if_exists      # load .env if present
uv sync --frozen      # ensure deps match uv.lock, no updates
source .venv/bin/activate
```
Approve it with direnv allow (you’ll re-allow after edits; that’s the safety valve).
Taskfile is a lightweight, cross-platform tool that lets you collect small automation scripts in a single YAML file. It supports caching and can automatically re-run tasks when files change. Install via Homebrew/Snap/Go, then run tasks with task <name>. It's more legible than a Makefile.
Minimal Taskfile.yml:
```yaml
version: '3'

dotenv: ['.env']  # load project env if present

tasks:
  setup:
    desc: Initial project setup
    cmds:
      - uv sync --frozen
      - uv run python manage.py migrate
      - uv run python manage.py collectstatic --noinput

  dev:
    desc: Start Django dev server
    cmds:
      - uv run python manage.py runserver 0.0.0.0:8000
    sources: ["**/*.py", "pyproject.toml", "uv.lock"]  # enable watch
    # run: task dev --watch

  test:
    desc: Run tests
    cmds:
      - uv run python manage.py test
    sources: ["**/*.py", "pyproject.toml", "uv.lock"]

  makemigrations:
    desc: Make new migrations
    cmds:
      - uv run python manage.py makemigrations

  migrate:
    desc: Apply migrations
    cmds:
      - uv run python manage.py migrate
```
Usage:
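With the tasks above, day-to-day usage looks like this (task --list and --watch are built-in Task features):

```shell
task setup          # one-time: sync deps, migrate, collect static files
task dev --watch    # dev server, re-run when watched sources change
task test
task --list         # show all tasks with their descriptions
```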
Task is boring, in the good way: one file, readable YAML, fast feedback with --watch, and fewer bespoke shell scripts cluttering your repo.
Replace pyenv, direnv, and your task runner with one fast tool that manages Python versions, environment variables, and development tasks through a simple TOML config. Mise automatically switches environments when you enter project directories and can run hooks to automate your workflow.
Basic mise.toml for a Django project:
```toml
[tools]
python = "3.11"
node = "20"    # for frontend assets
redis = "7"    # local development

[env]
# Load .env automatically
_.file = ".env"
DEBUG = "True"
DJANGO_SETTINGS_MODULE = "myproject.settings.dev"
# Mise can also auto-create a project-specific virtual environment;
# just set the path (for example, .venv) in the configuration.
_.python.venv = { path = ".venv", create = true }

[tasks.dev]
description = "Start development server"
run = "python manage.py runserver"

[tasks.setup]
description = "Initial project setup"
run = [
  "pip install -r requirements.txt",
  "python manage.py migrate",
]

# Hooks: run commands automatically on directory enter/exit
[hooks]
enter = "echo 'Entering {{ config_root }}'"
leave = "echo 'Leaving project'"
```
Usage:
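Typical commands (all standard mise subcommands):

```shell
mise install        # install everything listed under [tools]
mise run dev        # run the dev task
mise tasks          # list available tasks
mise doctor         # diagnose setup issues
```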
⚠️ Fair warning: Mise is relatively new (2023+) and still gaining traction. While actively developed and fast, it has a smaller community than established tools like pyenv. Worth trying for new projects, but consider the ecosystem maturity for critical production workflows.
Clean code isn't just aesthetics—it prevents subtle bugs, enforces shared conventions, and keeps diffs small. Historically you wired up Black for formatting, isort for imports, and Flake8 plus plugins for linting. Ruff collapses that stack into one fast tool: it implements 800+ lint rules (covering Flake8 and popular plugins) and ships a Black-compatible formatter, so you configure everything once in pyproject.toml and get consistent style and static checks at Rust speed.
Here's the Ruff configuration snippet I use in my projects:
```toml
[tool.ruff]
# Paths and files you don't want Ruff to lint or format
exclude = [
    ".bzr",
    ".direnv",
    ".eggs",
    ".git",
    ".git-rewrite",
    ".hg",
    ".ipynb_checkpoints",
    ".mypy_cache",
    ".nox",
    ".pants.d",
    ".pyenv",
    ".pytest_cache",
    ".pytype",
    ".ruff_cache",
    ".svn",
    ".tox",
    ".venv",
    ".vscode",
    "__pypackages__",
    "_build",
    "buck-out",
    "build",
    "dist",
    "node_modules",
    "site-packages",
    "venv",
]

# Maximum line length for code
line-length = 88
# Number of spaces per indentation level
indent-width = 4
# Minimum Python version to target for syntax
target-version = "py312"

[tool.ruff.lint]
# Rule sets to enable
select = [
    "E",    # pycodestyle errors
    "F",    # Pyflakes
    "W",    # pycodestyle warnings
    "C90",  # McCabe complexity
    "I",    # isort (import sorting)
    "N",    # pep8-naming
    "B",    # flake8-bugbear (common bugs)
    "UP",   # pyupgrade (Python syntax upgrades)
    "S",    # flake8-bandit (security)
    "A",    # flake8-builtins (builtin shadowing)
    "T10",  # flake8-debugger (leftover debug code)
    "ISC",  # flake8-implicit-str-concat
    "ICN",  # flake8-import-conventions
    "PIE",  # flake8-pie (miscellaneous lints)
    "PYI",  # flake8-pyi (type stubs)
    "RSE",  # flake8-raise (exception raising)
    "SIM",  # flake8-simplify (code simplification)
]
# Specific rules to ignore
ignore = [
    "E501",  # Line too long (handled by formatter)
    "E203",  # Whitespace before ':' (Black compatibility)
    "W503",  # Line break before binary operator (to match Black's formatting style)
]
# Allow Ruff to automatically fix these violations
fixable = ["ALL"]
# Prevent Ruff from auto-fixing these (empty = fix everything fixable)
unfixable = []
# Regex for valid unused variable names (e.g., _unused)
dummy-variable-rgx = "^(_+|(_+[a-zA-Z0-9_]*[a-zA-Z0-9]+?))$"

[tool.ruff.lint.mccabe]
# Maximum cyclomatic complexity allowed for functions/methods
max-complexity = 10

[tool.ruff.lint.isort]
# List your project's first-party packages here
known-first-party = []
# List explicit third-party packages if needed
known-third-party = []

[tool.ruff.lint.per-file-ignores]
# __init__.py files often re-export imports
"__init__.py" = ["F401", "F403"]  # F401: unused import, F403: wildcard import
# Test files can use assert and hardcoded test credentials
"tests/*" = ["S101", "S105"]  # S101: assert used, S105: hardcoded passwords

[tool.ruff.format]
# Use double quotes for strings
quote-style = "double"
# Use spaces for indentation (not tabs)
indent-style = "space"
# Preserve trailing commas (Black-compatible behavior)
skip-magic-trailing-comma = false
# Auto-detect line endings (LF on Unix, CRLF on Windows)
line-ending = "auto"
# Format code examples in docstrings
docstring-code-format = true
# Allow docstring code to exceed line length when necessary
docstring-code-line-length = "dynamic"
```
Ruff gives you consistency; pre-commit makes it non-optional. It’s a multi-language Git-hook manager that installs and runs your chosen checks automatically at commit time (and other stages, if you configure them), giving fast feedback before code ever hits CI. You list hooks in .pre-commit-config.yaml, and the tool handles isolated environments for each hook—even bootstrapping the runtimes they need.
Concretely: you can keep lints/formatting quick in the pre-commit stage, and push slower checks (e.g., security scans) to pre-push or other stages via the stages: setting. Git itself runs these hooks before creating a commit; a non-zero exit blocks the commit (developers can still bypass with --no-verify, which is why CI should also run them).
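For example, a slower scanner can be deferred to push time with a stages: override; the repo and rev below are illustrative:

```yaml
- repo: https://github.com/PyCQA/bandit
  rev: 1.7.9
  hooks:
    - id: bandit
      stages: [pre-push]   # too slow for every commit; run before pushing
```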
My current config (annotated):
```yaml
fail_fast: true

default_language_version:
  python: python3.12

exclude: |
  (?x)(
      ^.*/migrations/.*\.py$|
      ^.venv/.*|
      ^node_modules/.*|
      ^static/vendor/.*|
      ^media/.*|
      .*\.(png|jpg|jpeg|gif|svg|ico|woff|woff2|ttf|eot)$
  )

repos:
  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.8.4
    hooks:
      - id: ruff
        types_or: [python, pyi, jupyter]
        args: ["check", "--select", "I", "--fix"]
      - id: ruff-format
        types_or: [python, pyi, jupyter]

  - repo: https://github.com/astral-sh/uv-pre-commit
    rev: 0.5.8
    hooks:
      - id: uv-lock
      - id: uv-export
        args: ["--frozen", "--no-dev", "--no-hashes", "--output-file=requirements/requirements.txt"]
      - id: uv-export
        args: ["--frozen", "--no-dev", "--no-hashes", "--group", "docker", "--output-file=requirements/docker-requirements.txt"]
      - id: uv-export
        args: ["--frozen", "--no-hashes", "--output-file=requirements/dev-requirements.txt"]

  - repo: https://github.com/gruntwork-io/pre-commit
    rev: v0.1.23
    hooks:
      - id: helmlint

  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.5.0
    hooks:
      - id: check-yaml
        files: ^(?!deployment/helm-chart/).*\.ya?ml$  # only check non-helm YAML files
        exclude: ^deployment/helm-chart/.*\.ya?ml$    # helm templates aren't plain YAML
      - id: check-toml
      - id: check-json
        exclude: ^.vscode/launch.json$
      - id: check-ast
      - id: end-of-file-fixer
      - id: trailing-whitespace
      - id: check-added-large-files
      - id: debug-statements
      - id: check-case-conflict
      - id: check-docstring-first
      - id: detect-private-key
      - id: check-merge-conflict
      - id: mixed-line-ending
        args: [--fix=auto]
      - id: requirements-txt-fixer
      - id: name-tests-test
        args: ['--pytest-test-first']
      - id: check-executables-have-shebangs
```
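Wire it up with the standard pre-commit commands (shown here via uvx, matching the tool-running approach above):

```shell
uvx pre-commit install                       # install the git hook into .git/hooks
uvx pre-commit install --hook-type pre-push  # if any hooks use the pre-push stage
uvx pre-commit run --all-files               # run every hook against the whole repo
uvx pre-commit autoupdate                    # bump hook revs to the latest tags
```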
That's the gist: pre-commit turns "please run the linters" into "you can't forget."
Style guides such as PEP 8 provide consistent conventions for naming, docstrings, imports, and code structure. Useful for maintaining readable code across teams and projects.
Type checkers (mypy, pyright) analyze type hints to catch type-related errors before runtime. They help document function signatures and prevent common bugs like passing the wrong data types.
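For instance (a toy total_cents function, not from any library), hints document the contract and let a checker like mypy flag mismatches before the code ever runs:

```python
def total_cents(prices: list[float]) -> int:
    """Sum a list of prices (in dollars) and return whole cents."""
    return round(sum(prices) * 100)

# Correct usage: the argument type matches the annotation.
print(total_cents([1.25, 2.50]))  # 375

# A checker such as mypy would flag this call before runtime:
#     total_cents("not a list")
#     error: Argument 1 has incompatible type "str"; expected "list[float]"
```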
Python testing frameworks (unittest, pytest) take different approaches to writing and organizing tests.
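A minimal sketch using the standard library's unittest (slugify is a made-up function under test); pytest would express the same checks as bare assert statements in plain test_ functions:

```python
import unittest


def slugify(title: str) -> str:
    """Toy function under test: lowercase a title and join words with hyphens."""
    return "-".join(title.lower().split())


class SlugifyTests(unittest.TestCase):
    def test_basic(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_collapses_whitespace(self):
        self.assertEqual(slugify("  A   B "), "a-b")

# run with: python -m unittest test_slugify.py  (or simply `pytest`)
```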
Automates testing across multiple Python versions and dependency combinations in isolated environments. Particularly useful for library authors ensuring compatibility.
Cross-platform development challenges arise from different package managers, file systems, and system libraries. Consider a Unix-like environment (any Linux distro, with Ubuntu a common choice, or macOS) for a developer-friendly setup that avoids most compatibility pain. For folks on Windows, I strongly suggest WSL2; several popular tools, including TensorFlow and Docker, recommend it outright in their installation guides.
Containers solve dependency conflicts and deployment consistency by packaging applications with their complete runtime environment.
Virtual machines provide clean, disposable environments for testing deployment scenarios and experimenting with system-level changes.
Development containers provide reproducible environments where the entire team works with identical toolchains and dependencies.
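A minimal devcontainer.json sketch, assuming VS Code and the Microsoft Python base image; the image tag, extension IDs, and post-create command are illustrative:

```json
{
  "name": "python-app",
  "image": "mcr.microsoft.com/devcontainers/python:3.12",
  "postCreateCommand": "pip install uv && uv sync",
  "customizations": {
    "vscode": {
      "extensions": ["ms-python.python", "charliermarsh.ruff"]
    }
  }
}
```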
Tools that create isolated development environments without the overhead of full containerization.
SSH-based development allows working on powerful remote machines while keeping your local environment lightweight.
IDE shortcuts reduce context switching and improve development speed. Focus on file navigation, multi-cursor editing, and pane management.
EditorConfig and shared editor settings keep formatting and behavior consistent across team members and different editors.
Structured debugging with breakpoints, variable inspection, and step-through execution instead of print debugging.
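A small illustration with a hypothetical buggy_average function: Python's built-in breakpoint() (PEP 553) drops into pdb at that line, and the PYTHONBREAKPOINT environment variable controls or disables the hook:

```python
import os


def buggy_average(values):
    total = sum(values)
    # breakpoint()  # uncomment to inspect `total` and `values` in pdb here
    return total / len(values)


# PYTHONBREAKPOINT=0 turns every breakpoint() call into a no-op, so a
# forgotten breakpoint can't hang a script in CI:
os.environ["PYTHONBREAKPOINT"] = "0"
breakpoint()  # does nothing while the hook is disabled
print(buggy_average([2, 4, 6]))  # 4.0
```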
Visual customization to reduce eye strain and improve readability during extended coding sessions.
Enhanced shells provide better completion, history, and prompt customization with contextual information like git status and virtual environments.
Command-line fuzzy search tools for quickly finding files, commands, and navigating large codebases.
Enhanced Python shells that improve the development experience with better completion, history, and debugging capabilities.
IPython is particularly valuable for exploratory development, data analysis, and testing code snippets interactively. Its magic commands (%timeit, %debug, %run) and seamless integration with Jupyter notebooks make it essential for iterative development workflows.
Linux is the de-facto standard for servers and cloud compute environments. Even a simple command like ps aux | grep python is incredibly handy for troubleshooting, and proficiency with Linux basics and CLI tools pays off quickly.
Become proficient with logging; it's essential in any non-local setup.
Tools for identifying performance bottlenecks and understanding where applications spend execution time.
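For a quick start, the standard library's cProfile and pstats will surface hotspots before you reach for fancier samplers like py-spy; slow_sum below is a stand-in workload:

```python
import cProfile
import io
import pstats


def slow_sum(n: int) -> int:
    """Stand-in workload: sum of squares below n."""
    return sum(i * i for i in range(n))


profiler = cProfile.Profile()
profiler.enable()
slow_sum(200_000)
profiler.disable()

# Print the 5 most expensive entries by cumulative time.
out = io.StringIO()
stats = pstats.Stats(profiler, stream=out).sort_stats("cumulative")
stats.print_stats(5)
print(out.getvalue())
```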
Production error tracking and application performance monitoring for identifying issues before they impact users.
This toolkit reflects my current preferences after navigating Python's ecosystem for the past two years. What I've learned is that the specific tools matter less than having consistent, automatable practices that scale with your team and projects.
Important caveats: This setup works best for greenfield projects where you can establish good practices from day one. Legacy codebases require a more gradual migration approach—introducing these tools incrementally rather than wholesale replacement. Each codebase has its own context, constraints, and team dynamics that influence what "good DX" actually looks like.
The Python ecosystem will continue evolving. uv might get superseded by something better, new linters will emerge, and development practices will shift. What won't change is the underlying principle: invest in tooling that reduces friction, catches problems early, and lets you focus on solving actual business problems rather than fighting your development environment.
My advice? Pick a few tools that solve your biggest pain points and gradually build up your toolkit. The hours spent tinkering with Linux distros, reading through dense documentation, and setting up development environments might feel unproductive, but they build the troubleshooting instincts and system understanding that make you a more capable developer. Embrace the exploration—your future self (and your teammates) will thank you for it.
The goal isn't to use every tool mentioned here, but to thoughtfully curate a development environment that makes you more productive and your code more reliable. Start with the basics, experiment with what interests you, and build your own opinionated toolkit over time.