I write DataWeave, design MuleSoft integration architecture, and build desktop and serverless tools when the existing options aren't good enough.
I'm an Integration Engineer at a housing finance NBFC regulated by the RBI, and the primary MuleSoft developer on the team. I own 40+ APIs and 720+ endpoints across a four-server on-prem production environment — system APIs, process APIs, experience APIs — serving 1 lakh+ daily transactions.
My day job is designing integration flows, writing DataWeave, managing Salesforce connectivity, handling VAPT remediation, and keeping production stable while shipping. Outside of work, I build developer tools and open-source projects that solve problems I actually run into.
A local desktop application for writing, testing, and debugging DataWeave 2.0 scripts. No Anypoint Studio. No Eclipse. No 2-minute startup time. Full Mule message context from a UI, in one window.
Testing a DataWeave script in the standard MuleSoft toolchain is genuinely painful. Anypoint Studio is 2GB, Eclipse-based, takes minutes to start, and forces you to spin up a full Mule application just to test a single transform.
The online playground has no real execution context. No vars, no config properties, no headers. It crashes on large payloads. The VSCode extension requires hand-written JSON files and a specific folder structure for every new input.
None of these tools reflect what actually runs in production — where a script might depend on `vars.correlationId`, a decrypted secure property, or a specific content-type header to behave correctly.
DataWeave Studio bundles the DataWeave CLI, wraps it in a Monaco-powered editor, and exposes the full Mule message context through a UI. You set `attributes.method`, headers, vars, config properties, and secure values — all without touching a file.
It ships as a native installer for Windows, macOS Intel, macOS ARM,
and Linux. GitHub Actions builds every release. Cloudflare Workers proxy the
download assets — so the in-app updater works even behind corporate firewalls that
block github.com.
Real users. Community traction in the MuleSoft ecosystem. A speaking invite that came directly off the back of the project.
Load a `secure-config.yaml` containing `![Base64Encrypted...]` values, provide the AES-CBC encryption key, and scripts execute with real decrypted values. The key lives only in memory for the duration of the session — never written to disk, never logged. Same algorithm as MuleSoft's `secure-properties-tool.jar`.
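The placeholder handling can be sketched in a few lines. This is a minimal illustration, not the app's actual code: `resolve_secure_props`, the regex, and the injected `decrypt` callback are illustrative names, and the real AES-CBC decryption is omitted.

```python
import re

# Matches MuleSoft-style encrypted placeholders: ![Base64Ciphertext]
ENCRYPTED = re.compile(r'^!\[(?P<ciphertext>[A-Za-z0-9+/=]+)\]$')

def resolve_secure_props(props: dict, decrypt) -> dict:
    """Return a copy of props with ![...] values decrypted.

    The decrypt callback is injected (AES-CBC in the real tool),
    so the key only ever lives in the caller's memory.
    """
    resolved = {}
    for key, value in props.items():
        match = ENCRYPTED.match(value)
        resolved[key] = decrypt(match.group('ciphertext')) if match else value
    return resolved

# Stand-in decrypt function, for illustration only:
props = {'db.password': '![c2VjcmV0]', 'db.host': 'localhost'}
plain = resolve_secure_props(props, decrypt=lambda c: '<decrypted:' + c + '>')
```

Injecting the decrypt function keeps key handling at the edge of the program, which is what makes the "never written to disk, never logged" guarantee easy to audit.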
Set `attributes.method`, `attributes.headers`, `queryParams`, `vars`, config properties, and named input streams — all from form fields. No hand-written JSON files.
Queries support `:param` binding and render the exact final query before execution.
Sessions save to a `.dwstudio` file. Open it, pick up where you left off.
The updater checks `update.json` on startup and installs in one click. GitHub is never touched directly — all traffic goes through Cloudflare.
A `v*` tag triggers a build matrix: Windows, macOS Intel, macOS ARM, Linux. Native installers are produced automatically.
A Cloudflare Worker serves the `update.json` endpoint that the in-app updater polls on startup, so updates work even where the firewall blocks `*.github.com`.
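The updater-side check is simple to sketch. This is a hedged illustration, not the app's actual code: `UPDATE_URL` is a hypothetical Worker endpoint, and the manifest is assumed to carry a semver `version` field.

```python
import json
from typing import Optional
from urllib.request import urlopen

# Hypothetical Worker endpoint, standing in for the real update URL.
UPDATE_URL = 'https://updates.example.com/update.json'

def parse_version(v: str) -> tuple:
    # 'v1.10.2' -> (1, 10, 2); tuples compare numerically, so
    # 1.10.0 correctly beats 1.9.9 (a string compare would not).
    return tuple(int(part) for part in v.lstrip('v').split('.'))

def needs_update(current: str, manifest: dict) -> bool:
    return parse_version(manifest['version']) > parse_version(current)

def check_for_update(current: str) -> Optional[dict]:
    # The Worker proxies GitHub Releases, so this request never
    # touches github.com directly.
    with urlopen(UPDATE_URL) as resp:
        manifest = json.load(resp)
    return manifest if needs_update(current, manifest) else None
```

Comparing version tuples rather than strings is the detail that matters here; it is what keeps `1.10.0` from sorting below `1.9.9`.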
Real-time, ephemeral text and file transfer between any devices with a browser. No accounts. No persistence beyond the session. End-to-end encrypted when you need it — the server genuinely cannot read your content.
When a room is destroyed, the `postgres_changes` DELETE event fires across all connected clients instantly — ROOM VAPORIZED on every screen.
A serverless, multi-tenant backend that turns any CSV of postal codes into a private, queryable REST API. Upload the CSV. Get a live API back in seconds. No server provisioning, no database setup, nobody else's data mixed with yours.
Every upload gets its own `datasetId`.
The CSV never passes through API Gateway. It lands in S3, which fires an `ObjectCreated` event; the processor Lambda parses the CSV, validates the schema, and batch-writes to DynamoDB. Poll `GET /dataset/{id}/status` until it returns `SUCCEEDED`, then query `GET /postal-code/search?pincode=110001` and get your own data back instantly.
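The ingest step can be sketched as a pure function. Item shape and helper names here are illustrative, though the composite `pincode_officename` sort key matches the one the query path uses; the 25-item batch size is DynamoDB's `batch_write_item` hard limit.

```python
import csv
import io

BATCH_SIZE = 25  # DynamoDB batch_write_item hard limit

def csv_to_batches(body: str, user_id: str, dataset_id: str):
    """Parse an uploaded CSV and yield DynamoDB-shaped items in
    batches of 25, ready to feed straight into a batch writer."""
    batch = []
    for row in csv.DictReader(io.StringIO(body)):
        batch.append({
            'internalUserId': user_id,  # partition key: tenant isolation
            'pincode_officename': row['pincode'] + '#' + row['officename'],
            'datasetId': dataset_id,
            **row,
        })
        if len(batch) == BATCH_SIZE:
            yield batch
            batch = []
    if batch:
        yield batch  # final partial batch
```

Yielding batches instead of building one big list keeps memory flat even on million-row uploads.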
Built, deployed live on AWS, listed on the RapidAPI marketplace, and taken down when the free tier expired. The entire production source — infrastructure-as-code, all Lambda handlers, test suite — is open on GitHub.
The interesting parts aren't the domain. Pincodes are just lookup data. The interesting parts are the multi-tenancy design, the JIT provisioning pattern, and building end-to-end operational visibility on a fully serverless stack.
Every item is partitioned by `internalUserId`. Three GSIs cover three access patterns. Cross-tenant isolation is enforced structurally at the query level — not by application-level filtering that could be bypassed.
A single SSM parameter, `/pincode-api/enabled`, acts as a global kill switch. Set it to `false` and the authorizer denies every request globally, instantly — no deployment, no code change.
Extra CSV columns land in a `customData` map in DynamoDB and are returned as-is. A logistics company can upload `pincode,sla,risk,is_serviceable` — no schema pre-registration.
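The dynamic-schema handling reduces to a small split. A sketch, with `CORE_COLUMNS` and the function name assumed for illustration:

```python
CORE_COLUMNS = {'pincode', 'officename'}  # columns the API itself understands

def split_custom_data(row: dict) -> dict:
    """Keep core columns at the top level; every other CSV column
    goes into a customData map, stored and returned as-is."""
    item = {k: v for k, v in row.items() if k in CORE_COLUMNS}
    custom = {k: v for k, v in row.items() if k not in CORE_COLUMNS}
    if custom:
        item['customData'] = custom
    return item

row = {'pincode': '110001', 'officename': 'Connaught Place',
       'sla': '24h', 'risk': 'low', 'is_serviceable': 'true'}
item = split_custom_data(row)
```

Because DynamoDB is schemaless, the `customData` map can differ per tenant and per dataset with no migration step.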
Custom metrics — `LatencyMs`, `RequestsOk`, `RowsIngested` — are emitted via `aws-embedded-metrics`. X-Ray distributed tracing runs on all functions. An SQS DLQ depth alarm is wired to SNS.
```python
def handler(event, context):
    # Every request: validate → lookup → provision/sync
    _validate_proxy_secret(event['authorizationToken'])
    marketplace_id = _extract_marketplace_id(event)
    current_tier = _get_tier_from_headers(event)

    user = users_table.get_item(
        Key={'marketplaceId': marketplace_id}
    ).get('Item')

    if not user:
        # First request — create user on the spot
        user = _provision_user(marketplace_id, current_tier)
    elif user['tier'] != current_tier:
        # Subscription changed — sync immediately
        user = _sync_tier(user, new_tier=current_tier)

    # Downstream Lambdas receive userId + tier in context
    return _build_allow_policy(
        principal=user['internalUserId'],
        context={
            'userId': user['internalUserId'],
            'tier': user['tier'],
        }
    )
```
```python
def is_api_enabled(ssm) -> bool:
    """
    One SSM parameter. Instant global shutoff.
    No deployment. No code change.
    """
    param = ssm.get_parameter(
        Name='/pincode-api/enabled'
    )
    return param['Parameter']['Value'] == 'true'
```
```python
from boto3.dynamodb.conditions import Key  # for key-condition expressions

def query_pincodes(table, user_id: str,
                   pincode: str) -> list:
    """
    Single-table multi-tenancy.
    All tenants share one table — partition key
    is internalUserId. Cross-tenant leakage is
    impossible at the DynamoDB query level.
    """
    return table.query(
        KeyConditionExpression=(
            Key('internalUserId').eq(user_id) &
            Key('pincode_officename').begins_with(
                pincode
            )
        )
    )['Items']
```
| Tier | Max File | Max Rows | Max Datasets | Dynamic Schema |
|---|---|---|---|---|
| FREE | 50 KB | 250 | 1 | No |
| BUSINESS | 10 MB | 1,000,000 | 50 | No |
| GROWTH | 20 MB | 5,000,000 | 200 | Yes |
| ENTERPRISE | 50 MB | 25,000,000 | 1,000 | Yes |
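The table above translates directly into an upload guard. A sketch — the function name and error codes are illustrative, the limits are the ones from the table:

```python
# (max_file_bytes, max_rows, max_datasets, dynamic_schema) per tier
TIER_LIMITS = {
    'FREE':       (50 * 1024,     250,        1,     False),
    'BUSINESS':   (10 * 1024**2,  1_000_000,  50,    False),
    'GROWTH':     (20 * 1024**2,  5_000_000,  200,   True),
    'ENTERPRISE': (50 * 1024**2,  25_000_000, 1_000, True),
}

def check_upload(tier: str, file_bytes: int, row_count: int,
                 dataset_count: int, has_extra_columns: bool) -> str:
    """Validate an upload against its tier before any processing."""
    max_bytes, max_rows, max_datasets, dynamic = TIER_LIMITS[tier]
    if file_bytes > max_bytes:
        return 'FILE_TOO_LARGE'
    if row_count > max_rows:
        return 'TOO_MANY_ROWS'
    if dataset_count >= max_datasets:
        return 'DATASET_QUOTA_REACHED'
    if has_extra_columns and not dynamic:
        return 'DYNAMIC_SCHEMA_NOT_IN_TIER'
    return 'OK'
```

Because the authorizer already puts `tier` in the request context, this check costs nothing extra per request.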
A Windows desktop app for preparing codebases to paste into an LLM's context window. Drop a folder, configure what to exclude, hit one button — you get a clean structured dump with a file tree, ready for Claude, GPT, or anything else.
All scanning and processing runs off the UI thread via `compute()`. The UI never freezes, even on large codebases. Disk writes use a streaming chunk-by-chunk approach — massive outputs don't blow up RAM.
Reads the project's `.gitignore` and respects it during scans.
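A sketch of the exclusion matching, covering only a small subset of `.gitignore` semantics (no negation, no anchoring) — illustrative, not the app's actual matcher:

```python
import fnmatch

def load_ignore_patterns(gitignore_text: str) -> list:
    """Parse .gitignore lines: skip blanks and comments, strip the
    trailing slash from directory patterns."""
    patterns = []
    for line in gitignore_text.splitlines():
        line = line.strip()
        if line and not line.startswith('#'):
            patterns.append(line.rstrip('/'))
    return patterns

def is_ignored(rel_path: str, patterns: list) -> bool:
    # A path is excluded if any of its components matches a pattern,
    # so node_modules/react/index.js is caught by 'node_modules/'.
    parts = rel_path.split('/')
    return any(fnmatch.fnmatch(part, pat)
               for part in parts for pat in patterns)
```

Matching per path component is what makes a single `node_modules/` line prune the entire subtree.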
Exclusion settings are stored in a `.exclusion_settings.json` file at the project root — not locked to the machine or the app install.
Sensible defaults: `node_modules`, `build`, `.git`, binaries, and type-specific artifacts are excluded out of the box.
Built with `fluent_ui` and `flutter_acrylic`. Native Windows Mica and Acrylic glass effects. Dark, light, and pitch-black themes with multiple accent color flavors.
- `AppState` (a `ChangeNotifier`) manages UI state.
- `SettingsService` handles persistence.
- `FileService` does all heavy I/O in isolates.
- Pages are thin UI shells.