Member Validation and Data Update
Portal flows tied to validation and downstream data quality.
Member portal work combining LALIGA validation, mandatory profile data updates, and normalization of inconsistent legacy records before they reached the DataLake.
Role
Full-stack Engineer / Integration Owner
Type
Client Project
Context
This work sat in member portals where users had to verify their identity with idPersona and PIN, then update their profile data before accessing services or purchase flows.
Problem and constraints
The data coming from legacy sources was inconsistent: postal codes and municipalities did not match, countries appeared in multiple formats, and form fields contained years of unnormalized values.
Users were also getting blocked at login and update points because identity fields and profile data were often wrong at the source, so the portal had to distinguish between invalid credentials, malformed input, and recoverable data issues.
The flow had to validate against LALIGA APIs and push corrected data downstream to the DataLake, so bad input could not simply be accepted and cleaned later.
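The three-way distinction above can be sketched as an explicit classification step. This is a minimal illustration, not the production code: the function name, the 4-digit PIN rule, and the in-memory credential lookup are assumptions standing in for the real LALIGA API call.

```python
from enum import Enum

class ValidationOutcome(Enum):
    MALFORMED_INPUT = "malformed_input"           # input fails basic format checks
    INVALID_CREDENTIALS = "invalid_credentials"   # well-formed, but idPersona/PIN do not match
    RECOVERABLE_DATA = "recoverable_data"         # credentials ok, profile data needs correction
    OK = "ok"

def classify(id_persona: str, pin: str, known_pins: dict, profile_issues: list) -> ValidationOutcome:
    # Format checks first: an empty or non-numeric field is malformed, not "wrong".
    # (4-digit numeric PIN is an assumed format for this sketch.)
    if not id_persona.isdigit() or not (pin.isdigit() and len(pin) == 4):
        return ValidationOutcome.MALFORMED_INPUT
    # Credential check: in the real flow this is a call to the LALIGA validation API.
    if known_pins.get(id_persona) != pin:
        return ValidationOutcome.INVALID_CREDENTIALS
    # Credentials are valid but the profile has issues the user can fix: force an update.
    if profile_issues:
        return ValidationOutcome.RECOVERABLE_DATA
    return ValidationOutcome.OK
```

Keeping the three failure modes separate is what lets the portal show a useful message instead of a generic login error.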
Approach and technical decisions
I designed login, registration, and mandatory profile update flows with validation on both frontend and backend, including idPersona and PIN checks before the user could move deeper into the portal.
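The gating logic can be sketched as a simple server-side guard that runs before any deeper route. The session shape and redirect targets here are hypothetical; the point is that an incomplete profile blocks progress even after a successful idPersona/PIN check.

```python
def portal_gate(session: dict) -> str:
    # Assumed session shape: {"validated": bool, "profile_complete": bool}.
    # Step 1: identity must be validated (idPersona + PIN) before anything else.
    if not session.get("validated"):
        return "redirect:/login"
    # Step 2: a validated user with stale or incomplete data is forced through
    # the mandatory profile update before reaching services or purchase flows.
    if not session.get("profile_complete"):
        return "redirect:/profile-update"
    return "allow"
```

Running the guard on every protected route, rather than only at login, is what makes the update genuinely mandatory.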
For normalization, I used explicit rules plus JSON mappings to verify postal-code/municipality pairs, and added server-side validation, duplicate controls, indexes, and database restrictions before persisting and forwarding data.
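A pair check of that kind can be sketched as a lookup against a JSON mapping. The mapping below is a tiny hypothetical fragment (the real file covered the full set of codes), and the normalization rules shown are illustrative:

```python
import json

# Hypothetical fragment of the mapping file: postal code -> valid municipalities.
POSTAL_MAP = json.loads("""
{
  "28001": ["Madrid"],
  "08001": ["Barcelona"],
  "41001": ["Sevilla"]
}
""")

def postal_pair_valid(postal_code: str, municipality: str) -> bool:
    # Normalize casing and whitespace before the lookup; legacy data mixed
    # "MADRID", "madrid ", and "Madrid" freely.
    code = postal_code.strip()
    muni = municipality.strip().title()
    return muni in POSTAL_MAP.get(code, [])
```

Rejecting the pair at input time, rather than repairing it later, is what kept mismatched records out of the DataLake.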
Challenges
Outcome
The portals stopped accepting inconsistent profile data as-is and started enforcing normalized updates before users continued.
That reduced the number of users blocked by malformed identity and profile input, and lowered the number of mismatched records reaching the DataLake and downstream operational flows.