Portfolio · 2026

Arjun Shewalkar
Computer Science · Software Engineer

Crafting systems where technical precision meets something quietly alive.

React TypeScript Node.js Solidity Python

Building interfaces that feel like they were made by hand, not generated by machine.

Selected Work
01 / Serene

Serene / Inner Bloom

Mindful Meditation App
Stack TypeScript · Vite
Focus Calm UX · Cognitive Ease

I wanted to build something that feels like a deep breath.

Not another productivity app disguised as wellness, but an actual space where your mind can settle without being told what to do.

The interface strips away everything that doesn't serve stillness. No aggressive timers, no achievement badges, no guilt-inducing streaks. Just generous whitespace, gentle transitions, and typography that doesn't shout. Every interaction is designed to feel unhurried — like the app itself is breathing with you.

View Website
Design Principles
  • Visual rhythm that mirrors natural breathing
  • Typography deliberately oversized for easy reading
  • Palette pulled from early morning and late evening light
  • Motion that suggests settling, not hurrying
02 / Cockpit

Cockpit Console

Web3 Interface Demo
Stack TypeScript · Vite
Focus Information Architecture

Could I build a Web3 interface that doesn't feel like it's trying to sell you something?

A lot of crypto products lean heavily on flashy gradients and animated charts, disconnected from the fact that these tools deal with real value and real decisions.

I thought about it like a driver's cockpit. Everything has a place you can rely on. You shouldn't have to hunt for information or guess what a control does. Color is used sparingly — only to show when something actually changes.

View Website
Architecture
  • Component library with consistent visual grammar
  • Real-time state management without visual chaos
  • Responsive grid that maintains information density
  • Accessibility-first interaction patterns
03 / AskMyNotes

AskMyNotes

AI Study Copilot · Hackathon
Stack React · Python · Claude API · Web Speech
Focus RAG · Voice Interaction · Grounded AI

Built in 8 hours. A study copilot that answers only from your own notes — never invents.

You upload notes across up to three subjects. Select one, ask a question, and the system answers strictly from that subject's material — citing the exact file, section, and confidence level. If the answer isn't in your notes, it refuses outright: "Not found in your notes for [Subject]." No hallucination, no guessing.
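The scoping-and-refusal flow can be sketched in miniature. Everything here is illustrative: the names (Chunk, retrieve, answer) are hypothetical, retrieval is naive keyword overlap rather than the embedding search a real build would use, and the Claude API call is omitted entirely. The point is the control flow: answers come only from in-subject chunks, and an empty retrieval produces the refusal string instead of a model guess.

```python
from dataclasses import dataclass

@dataclass
class Chunk:
    subject: str      # which of the three subjects this note belongs to
    source_file: str  # cited back to the user, e.g. "cells.pdf"
    section: str      # cited section within the file
    text: str

def retrieve(chunks, subject, question, min_overlap=2):
    """Naive keyword-overlap retrieval, strictly scoped to one subject.
    (Illustrative stand-in for real embedding-based retrieval.)"""
    q_words = set(question.lower().split())
    scored = []
    for c in chunks:
        if c.subject != subject:
            continue  # never answer from another subject's notes
        overlap = len(q_words & set(c.text.lower().split()))
        if overlap >= min_overlap:
            scored.append((overlap, c))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [c for _, c in scored]

def answer(chunks, subject, question):
    hits = retrieve(chunks, subject, question)
    if not hits:
        # Strict refusal: no retrieved evidence means no answer,
        # never a fallback to the model's own knowledge.
        return f"Not found in your notes for {subject}."
    best = hits[0]
    confidence = "High" if len(hits) > 1 else "Medium"
    return (f"[{confidence}] {best.text} "
            f"(source: {best.source_file}, {best.section})")
```

In the real system the retrieved chunks would be passed to the Claude API as the only permitted evidence; the refusal branch is what keeps the "zero fabrication" guarantee enforceable.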

Phase 2 extended this into a voice-first teacher experience. Ask questions by speaking, hear answers read back, and maintain multi-turn context — follow up with "give an example" or "simplify it" and the system remembers the thread. All Phase 1 constraints remained fully intact: subject scoping, citations, confidence levels, and strict refusal behaviour.
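The multi-turn thread can be sketched the same way. This is a toy: the class and its phrase list are hypothetical, and a real build would let the language model resolve follow-ups rather than matching fixed strings. It only shows the shape of the idea, that a follow-up like "give an example" is rewritten against the previous question before it ever reaches the answering pipeline.

```python
class StudyThread:
    """Carries conversation history so short follow-ups resolve
    against the previous question (illustrative sketch only)."""

    FOLLOW_UPS = {"give an example", "simplify it", "explain again"}

    def __init__(self, subject):
        self.subject = subject
        self.history = []  # list of (question, answer) pairs

    def resolve(self, utterance):
        """Rewrite a bare follow-up to include the prior topic."""
        if utterance.strip().lower() in self.FOLLOW_UPS and self.history:
            last_question, _ = self.history[-1]
            return f"{utterance} (regarding: {last_question})"
        return utterance

    def ask(self, utterance, answer_fn):
        """answer_fn(subject, question) stands in for the grounded
        Q&A pipeline; voice input/output would wrap this call."""
        question = self.resolve(utterance)
        reply = answer_fn(self.subject, question)
        self.history.append((question, reply))
        return reply
```

Because resolution happens before answering, every rewritten follow-up still passes through the same subject scoping, citation, and refusal checks as a fresh question.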

Features
  • PDF/TXT upload across three scoped subjects
  • Grounded Q&A with citations + evidence snippets
  • Confidence levels — High / Medium / Low
  • Strict "not found" refusal, zero fabrication
  • Voice input & spoken responses via Web Speech API
  • Multi-turn context for natural follow-up questions
  • Study Mode: 5 MCQs + 3 short-answer questions generated per subject
Built In
  • Phase 1 — 4 hours: core Q&A, citations, refusal logic
  • Phase 2 — 4 hours: voice layer + multi-turn conversation