How This Started
I'm a network engineer. I spend a lot of time using online tools: subnet calculators, BGP looking glasses, DMARC checkers, MTU calculators. Most of them are ugly, covered in ads, or require you to create an account to use a text field. Some are just wrong.
So I started building my own. One became five, five became fifteen, and at some point I looked up and had 38 of them. The tools page at tunaozcan.com/tools is one of the most visited parts of the site, which tells me I wasn't alone in finding the existing options frustrating.
Here's how I built them and why I made the choices I made.
The Framework Decision (Or: Why I Didn't Use One)
The honest answer is that I considered it and decided against it pretty quickly. The tools are computationally simple. A subnet calculator takes an IP and a prefix, runs some bitwise operations, and displays results. A DMARC checker makes a DNS query and parses a string. None of this requires state management, component trees, or a virtual DOM.
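To make that concrete, here's a sketch of the kind of bitwise subnet math involved. The helper names are mine, not the tool's actual code:

```javascript
// Convert dotted-quad IPv4 to an unsigned 32-bit integer and back.
function ipToInt(ip) {
  return ip.split(".").reduce((acc, octet) => ((acc << 8) | Number(octet)) >>> 0, 0);
}
function intToIp(n) {
  return [n >>> 24, (n >>> 16) & 255, (n >>> 8) & 255, n & 255].join(".");
}

// Network, broadcast, and usable host count for an IP + prefix length.
function subnetInfo(ip, prefix) {
  const mask = prefix === 0 ? 0 : (0xffffffff << (32 - prefix)) >>> 0;
  const network = (ipToInt(ip) & mask) >>> 0;
  const broadcast = (network | (~mask >>> 0)) >>> 0;
  return {
    network: intToIp(network),
    broadcast: intToIp(broadcast),
    hosts: prefix >= 31 ? 0 : 2 ** (32 - prefix) - 2, // minus network/broadcast
  };
}
```

That's essentially the whole computational core; the rest of such a tool is form handling and display.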
React, Vue, or Svelte would have added a build step, a node_modules directory, a bundler to configure, and a deployment pipeline to maintain, all for what is ultimately a collection of HTML forms with JavaScript attached. The overhead didn't justify the benefits.
Vanilla JS also meant that anyone could open the browser devtools and read exactly what the tool does. No minified bundles, no source maps to chase. For tools that engineers will use to make real networking decisions, I wanted that transparency to be available.
The tradeoff is that some things are more verbose. There's no reactive binding; when state changes, I update the DOM manually. For 38 single-page tools, that's fine. For a complex application, it would become painful quickly.
The Architecture
The site is hosted on GoDaddy with Cloudflare in front for CDN, caching, and the Workers AI integration. No backend, no database, no server-side rendering โ everything runs in the browser.
The structure is flat and intentional:
/
├── index.html
├── tools.html              ← hub page with 38 tool cards + filter pills
├── tools/
│   ├── subnet-calculator.html
│   ├── bgp-path-selector.html
│   ├── dmarc-checker.html
│   └── ... (35 more)
├── articles/
├── js/
│   ├── components.js       ← nav, footer, TOC, schema injection
│   ├── articles-data.js
│   └── main.js
└── css/
    └── style.css
Each tool is a self-contained HTML file. No shared state between tools, no routing library, no module imports. The tools hub uses hash-based filtering โ the URL tools.html#email triggers the email tools filter automatically via a small function that reads window.location.hash on load.
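The hash-filter pattern is simple enough to show. This is a sketch of the idea, assuming the cards carry a `data-category` attribute; the actual selectors and names on the site may differ:

```javascript
// Map a location hash like "#email" to a filter category ("" = show all).
function hashToFilter(hash) {
  return decodeURIComponent(hash.replace(/^#/, "")).toLowerCase();
}

// Browser-only wiring: hide cards whose data-category doesn't match.
function applyHashFilter() {
  const category = hashToFilter(window.location.hash);
  document.querySelectorAll("[data-category]").forEach((card) => {
    card.hidden = Boolean(category) && card.dataset.category !== category;
  });
}

if (typeof window !== "undefined") {
  window.addEventListener("DOMContentLoaded", applyHashFilter);
  window.addEventListener("hashchange", applyHashFilter);
}
```

Listening to hashchange as well as load means clicking a filter pill and sharing the resulting URL both behave the same way.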
The Tools That Were Harder Than Expected
BGP Path Selection Visualizer was the most complex to get right. BGP has 13 tiebreaker attributes in a specific order, and the logic for walking through them (finding the winning path and explaining why) required building a proper decision engine, not just a form. The tricky part was making the explanation useful rather than just outputting "Path A wins because Local Preference is higher." I wanted it to show the full comparison across all attributes and mark exactly where the decision was made.
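The shape of that decision engine is roughly this. A sketch with only the first few tiebreakers (the real process has 13), using attribute names I've chosen for illustration:

```javascript
// Ordered BGP tiebreakers: compare two paths attribute by attribute and
// report exactly which step decided the winner.
const TIEBREAKERS = [
  { name: "Weight", key: "weight", prefer: "higher" },
  { name: "Local Preference", key: "localPref", prefer: "higher" },
  { name: "AS Path Length", key: "asPathLength", prefer: "lower" },
  { name: "Origin", key: "origin", prefer: "lower" }, // IGP(0) < EGP(1) < Incomplete(2)
  { name: "MED", key: "med", prefer: "lower" },
];

function selectPath(a, b) {
  for (const t of TIEBREAKERS) {
    if (a[t.key] === b[t.key]) continue; // tie: fall through to the next step
    const aWins = t.prefer === "higher" ? a[t.key] > b[t.key] : a[t.key] < b[t.key];
    return { winner: aWins ? a : b, decidedBy: t.name };
  }
  return { winner: a, decidedBy: "tie (later tiebreakers needed)" };
}
```

Returning `decidedBy` from every comparison is what lets the UI mark the exact row where the decision fell, instead of just announcing a winner.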
VLSM Calculator required a proper subnet allocation algorithm. You give it a list of host requirements and it finds the optimal allocation with no overlap. The naive approach (sort by size, allocate sequentially) works, but I added binary tree visualization to show how the address space gets divided, which meant building a simple tree renderer in canvas.
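That naive allocation pass looks something like this (helper names are mine; the canvas tree renderer is omitted):

```javascript
function ipToInt(ip) {
  return ip.split(".").reduce((a, o) => ((a << 8) | Number(o)) >>> 0, 0);
}
function intToIp(n) {
  return [n >>> 24, (n >>> 16) & 255, (n >>> 8) & 255, n & 255].join(".");
}
// Smallest prefix whose block holds `hosts` usable addresses (+2 for network/broadcast).
function prefixFor(hosts) {
  return 32 - Math.ceil(Math.log2(hosts + 2));
}

// Sort largest-first so each block lands on a naturally aligned boundary,
// then allocate sequentially from the base address.
function allocateVlsm(baseIp, hostCounts) {
  let cursor = ipToInt(baseIp);
  return hostCounts
    .slice()
    .sort((x, y) => y - x)
    .map((hosts) => {
      const prefix = prefixFor(hosts);
      const net = intToIp(cursor);
      cursor += 2 ** (32 - prefix);
      return { hosts, subnet: `${net}/${prefix}` };
    });
}
```

Largest-first ordering is what guarantees no overlap: every subsequent, smaller block starts exactly where the previous one ended, on a boundary its prefix can use.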
Config Diff Tool seemed simple until I tried to implement it properly. The naive diff (line by line) produces terrible output for network configs, where indent changes or reordering move large blocks. I ended up implementing a proper LCS (Longest Common Subsequence) diff, which gives output that actually makes sense when comparing a before/after maintenance window config.
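The standard dynamic-programming LCS diff fits in about thirty lines. A sketch of the algorithm, not the tool's exact code:

```javascript
// Line-level diff via Longest Common Subsequence.
// Output lines are prefixed "  " (unchanged), "- " (removed), "+ " (added).
function diffLines(before, after) {
  const a = before.split("\n");
  const b = after.split("\n");
  // lcs[i][j] = length of the LCS of a[i..] and b[j..]
  const lcs = Array.from({ length: a.length + 1 }, () => new Array(b.length + 1).fill(0));
  for (let i = a.length - 1; i >= 0; i--) {
    for (let j = b.length - 1; j >= 0; j--) {
      lcs[i][j] = a[i] === b[j]
        ? lcs[i + 1][j + 1] + 1
        : Math.max(lcs[i + 1][j], lcs[i][j + 1]);
    }
  }
  // Walk the table, preferring matches, to emit the edit script.
  const out = [];
  let i = 0, j = 0;
  while (i < a.length && j < b.length) {
    if (a[i] === b[j]) { out.push("  " + a[i]); i++; j++; }
    else if (lcs[i + 1][j] >= lcs[i][j + 1]) { out.push("- " + a[i]); i++; }
    else { out.push("+ " + b[j]); j++; }
  }
  while (i < a.length) { out.push("- " + a[i]); i++; }
  while (j < b.length) { out.push("+ " + b[j]); j++; }
  return out;
}
```

Anchoring on the longest run of common lines means one inserted ACL entry doesn't mark every subsequent line as changed, which is the failure mode of a purely positional compare.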
Email Header Analyzer surprised me. Parsing raw email headers sounds straightforward โ split on newlines, extract values. But headers fold across lines, authentication results chain across multiple headers, and the timing analysis (measuring delay at each hop) requires careful timestamp parsing with timezone handling. That one took longer than any of the networking tools.
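The unfolding step alone shows why it's fiddly. A simplified sketch of header parsing with RFC 5322 folding handled (real headers need more care, e.g. comments and encoded words):

```javascript
// Parse raw headers into {name, value} pairs. A line that starts with
// whitespace is a folded continuation of the previous header (RFC 5322).
function parseHeaders(raw) {
  const headers = [];
  for (const line of raw.split(/\r?\n/)) {
    if (/^[ \t]/.test(line) && headers.length > 0) {
      headers[headers.length - 1].value += " " + line.trim();
    } else if (line.includes(":")) {
      const idx = line.indexOf(":");
      headers.push({ name: line.slice(0, idx).trim(), value: line.slice(idx + 1).trim() });
    }
  }
  return headers;
}
```

Only once headers are unfolded can you chain authentication results across them or extract the date clause from each Received header for the hop-delay analysis.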
What All the Tools Share
A few patterns I used consistently across all 38:
Everything runs locally. No data leaves the browser. The password hasher doesn't send your passwords anywhere. The encoder doesn't upload your content. The email header analyzer doesn't log headers to a server. I'm explicit about this in each tool's UI because engineers rightly ask the question.
Keyboard-friendly inputs. Pressing Enter in a text field submits the form. Tab order is logical. Results are immediately visible without scrolling. These sound like small things but they're the difference between a tool that fits into a workflow and one that doesn't.
Copy on click. Any command, config snippet, or result that you'd want to paste somewhere has a copy button. No selecting text, no right-click, no keyboard shortcut. Click, paste, done.
No ads, no login, no tracking on the tools themselves. The site has Google Analytics on the main pages. The tools don't phone home. This was a deliberate choice โ I wanted them to be something I'd actually use, and I don't use tools that feel like they're watching me.
The Cloudflare Workers AI Integration
The AI chat feature on the site (tunaozcan.com/chat) runs on Cloudflare Workers AI with Llama 3.1 8B. This was the most interesting piece of infrastructure to build because it required the only server-side component on the entire site.
The worker handles the API call to Cloudflare's AI gateway, adds a system prompt that primes the model for networking questions, and streams the response back to the browser. The total cost is zero โ Cloudflare's free tier covers the inference load at current traffic levels. It's a genuinely good free alternative to running your own API key through Anthropic or OpenAI.
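The Worker itself is short. Here's a sketch of the handler's shape (my names and system prompt, not the site's actual code); `env.AI` is the Workers AI binding and the model ID comes from Cloudflare's model catalog:

```javascript
// Chat handler: forward the user's message to Workers AI and stream the reply.
async function handleChat(request, env) {
  const { message } = await request.json();
  const stream = await env.AI.run("@cf/meta/llama-3.1-8b-instruct", {
    messages: [
      // Hypothetical system prompt; the real one primes for networking questions.
      { role: "system", content: "You are a concise networking assistant." },
      { role: "user", content: message },
    ],
    stream: true, // return tokens as they're generated
  });
  return new Response(stream, {
    headers: { "content-type": "text/event-stream" },
  });
}
```

In the deployed Worker this would be exported as the module's fetch handler (`export default { fetch: handleChat }`), with the `AI` binding declared in wrangler.toml.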
The limitation is model quality. Llama 3.1 8B is capable for general questions but struggles with very specific networking syntax or rare edge cases. I'm transparent about this on the chat page โ it's a useful starting point for questions, not a definitive source.
What I'd Do Differently
A few things I'd approach differently if starting over:
I'd build a shared utility library earlier. Right now there are small functions โ IP validation, binary conversion, subnet math โ duplicated across multiple tools. They're identical but unlinked. That wasn't a problem for 5 tools but at 38 it's annoying to maintain.
I'd be more aggressive about progressive enhancement from the start. Most tools work fine with JavaScript disabled (the HTML form still renders), but a few are fully blank without JS. That's a miss.
I'd also add deep-linking to specific results earlier. If you run a DMARC check and want to share the result with a colleague, right now you can only share the tool โ not the specific analysis. A simple URL parameter scheme would have fixed this.
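The scheme I have in mind is nothing more than query parameters. A sketch, with hypothetical parameter names:

```javascript
// Build a shareable URL that encodes the tool's inputs, and read it back.
function resultUrl(baseUrl, params) {
  const url = new URL(baseUrl);
  for (const [key, value] of Object.entries(params)) {
    url.searchParams.set(key, value);
  }
  return url.toString();
}

function readParams(urlString, keys) {
  const url = new URL(urlString);
  return Object.fromEntries(keys.map((k) => [k, url.searchParams.get(k)]));
}
```

On load, a tool would check for its expected parameters and re-run the analysis automatically, so a shared link reproduces the result rather than just opening an empty form.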
The Tools
All 38 are free, browser-only, and available at tunaozcan.com/tools. They cover IP and subnetting, BGP and routing, email authentication (SPF, DMARC, DKIM, blacklist checking), network configuration, photography calculators, and general dev utilities. If you find one useful, the only ask is to share it with someone who'd benefit.
If you notice a bug or have a tool you'd like to see added, the contact is on the about page.