RES Part II
- Liz Alvarez
- Mar 12
- 6 min read
A Minor Algorithmic Network
Like most companies, we never second-guessed holiday cards and client gifts, let alone thought they were anything more than sending best wishes. Every year, The Russian Empire treated the holiday season like a well-rehearsed performance. It was an "all hands on deck" situation. From the outside, it seemed festive: gifting strategies, custom holiday cards, perfectly wrapped packages. Things had to be "perfect," or so it seemed.
When I was brought in to assist, I assumed my role was straightforward: help curate holiday gifts, refine client lists, and ensure the aesthetic matched the company’s image. But every time I suggested ideas—gifts to align with client tiers, card designs that fit the holiday spirit—they never quite “fit.” It was never about what made sense. It wasn’t just about the branding or the clients. It was deeper than that, as if the ritual itself was the priority, not the actual outreach.
The CEO and a "board member" (one not even officially on payroll, but wielding heavy influence) would always step in and steer the process. They chose peculiar motifs that seemed off for a holiday card: a bicycle on a winter card instead of ice skates or sleds; a seaport theme tied to UNICEF donations the next year.
At the time, I didn’t question it. I should have.
My card suggestion in 2019

2019 Roll Out Design

keywords: 37, bicycle, 3, orange, snowflake, bench
Learn the code of the dark web: a covert system where children are monitored, categorized, and potentially trafficked based on digital messaging.
The "Board's" 2020 Card Choice


Decoding the trafficking model
PORT = Entry/exit points for moving victims.
BUBBLE = A hidden system keeping victims isolated (physically, digitally, or economically).
UNICEF = A cover or pretense that makes exploitation look like humanitarian aid.
Companies leveraging charitable partnerships for reputation-building isn’t new. But what happens when the “giving” represents something else entirely?
UNICEF has been at the center of multiple financial redirection scandals, with accusations of donor funds being funneled into administrative black holes rather than aid. (Source: Financial Transparency Coalition, 2023 Report)
This aligns with what I later discovered. The organization wasn't just inefficient with money; it was moving it with intent.
Their payment systems operated through layered redirections. It wasn't until 2019, after the client gifts were approved and I was already responsible for purchasing, packaging, and shipping them, that I realized a critical detail had been overlooked: no one from management had provided access to the company card to facilitate the transaction. As the deadline loomed, I asked for the card but received no response. With no other option and time running out, I was forced to charge the $10,000 expense to my own credit card. Reimbursement didn't come immediately.

Case studies show that similar financial blurring can be a red flag for deeper issues. The WeWork scandal revealed how executives mixed personal and company funds, leading to questionable reimbursements and accounting irregularities. In another case, the collapse of Wirecard in 2020 exposed how funds were shuffled between accounts to disguise financial instability, sometimes with employees being used as intermediaries. Even in the Wells Fargo unauthorized accounts scandal, financial mismanagement and lack of oversight led to employees being pressured into actions that ultimately benefited the company, not them. When companies push large, personal reimbursements onto employees, it not only disrupts personal finances but can also indicate efforts to manipulate cash flow, obscure tax liabilities, or even engage in financial misconduct behind the scenes.
But The Russian Empire's operation went beyond simple financial fraud. They weren't just trafficking and laundering; they were also monitoring, even with something as small as a Google Home Mini.

Money and gifted monitoring devices weren't the only tactics; opportunities were, too. If you control digital infrastructure, you control who moves forward and who doesn't.
Tactics Used:
Algorithmic Suppression – Financial applications “mysteriously” rejected based on silent blacklists
Payment Manipulation – Transaction failures leading to overdrafts, late fees, and financial penalties. For example, your credit card declines even though you have funds in your account, or fees are quietly redirected.
Search & Hiring Blackouts – Candidates quietly removed from job searches due to digital reputation tampering

We’ve already seen this happening at scale. A 2023 Harvard Business Review study confirmed that AI hiring filters reject applications before a human even sees them, often based on opaque, biased algorithms. (Source: HBR, 2023)
Amazon's 2018 AI recruiting tool failure is a perfect case study. The system, designed to streamline hiring, ended up silently blacklisting candidates, especially women, by teaching itself to downgrade resumes that included female-coded terms. This wasn't a glitch; it was a consequence of algorithmic bias controlling opportunity access, like what I experienced during The Big Apple.

Similarly, LexisNexis, a data analytics firm, was caught selling detailed personal financial and criminal history reports to law enforcement—without consent. This created a silent digital caste system where people could be denied housing, loans, or jobs based on incomplete or inaccurate records.
Financial Warfare at a Digital Scale
Connecting the dots revealed an uncomfortable truth: The Russian Empire was built to extract, manipulate, and erase. Financial instability wasn't a side effect; it was the goal. Imagine being charged an extra $2 on an $80 purchase. It's subtle, nearly unnoticeable. Now apply that across a million transactions. That's an easy $2M siphoned without a trace. This wasn't just fraud; it was a long-game operation where the true currency wasn't money, it was control. The road towards Deadpool.
These “fees” get rerouted through third-party processing layers—slush funds masked as operational costs.
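The arithmetic above can be sketched in a few lines. This is purely a hypothetical illustration of how micro-overcharges scale, not data from any real processor:

```python
# Hypothetical illustration: how a tiny per-transaction overcharge scales.
OVERCHARGE = 2.00          # extra dollars skimmed per purchase (assumed)
BASE_PRICE = 80.00         # nominal purchase amount (assumed)
TRANSACTIONS = 1_000_000   # volume over which the skim is applied (assumed)

# Each individual charge looks almost normal...
charged = BASE_PRICE + OVERCHARGE   # 82.00 instead of 80.00

# ...but in aggregate the skim is large.
total_skimmed = OVERCHARGE * TRANSACTIONS
print(f"${total_skimmed:,.0f} siphoned")  # $2,000,000 siphoned
```

The point is the asymmetry: a charge no one disputes individually becomes millions in aggregate.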
China's Social Credit System (Ongoing)
Citizens face digital blacklisting, limiting their ability to buy property, travel, or access loans based on algorithmic monitoring. Financial transactions, social behavior, and online activity determine one's ability to function in society. The system ensures control through financial dependency.

Recognizing the Signs
Exploitation often doesn’t appear obvious until after the damage is done. Financial manipulation, workplace exploitation, and digital suppression are designed to be subtle, slipping past unnoticed until patterns emerge. A stark example is the NXIVM case, a so-called self-improvement organization that turned out to be a deeply exploitative cult. Many members joined believing they were part of a legitimate leadership training program, only to later realize they had been subjected to coercion, financial control, and psychological manipulation. The group's tactics included financial entrapment, blackmail, and an intricate system of loyalty-based abuse. Many victims only saw the full extent of the exploitation after years of investment, making it difficult to disentangle themselves. (Source: U.S. Department of Justice, 2020 NXIVM Trial Findings)
Based on my experience, here are the key signs to watch for:
1. Financial Exploitation Disguised as Workplace Responsibility
Unpaid reimbursements: Being asked to cover company expenses out of pocket, with delayed reimbursements.
Hidden financial redirections: Vague or convoluted expense processes, especially those that require personal accounts to act as intermediaries.
Manipulated cash flow: Companies pushing financial burdens onto employees to avoid tax liabilities or manipulate balance sheets.
2. Algorithmic & Digital Suppression
Silent blacklists: Job applications being rejected before human review due to opaque AI filters.
Payment disruptions: Sudden declines, unauthorized fees, and "errors" that disproportionately affect specific users.
Digital caste systems: Background checks and data tracking used to deny loans, housing, or employment without transparency.
How to Navigate These Systems
1. Protecting Your Financial Autonomy
Set Boundaries: Never front large company expenses from personal accounts without formal agreements.
Audit Transactions: Monitor for small, unexplained charges or redirected payments that could indicate manipulation.
Use Secure Financial Channels: Leverage personal financial security measures, such as virtual credit cards or business expense tracking apps, to prevent unauthorized charges.
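A minimal audit pass along these lines could look like the following sketch. The merchant whitelist, threshold, and sample statement are all hypothetical examples, not real data:

```python
# Flag small, unexplained charges that don't match known merchants.
# All merchant names and amounts below are hypothetical.
KNOWN_MERCHANTS = {"grocery co", "utility co", "rent llc"}
SMALL_CHARGE_LIMIT = 5.00  # micro-charges are the easiest to overlook

def flag_suspicious(transactions):
    """Return charges that are both small AND from an unrecognized merchant."""
    return [
        (merchant, amount)
        for merchant, amount in transactions
        if amount <= SMALL_CHARGE_LIMIT and merchant.lower() not in KNOWN_MERCHANTS
    ]

statement = [
    ("Grocery Co", 54.10),
    ("proc-fee-7741", 2.00),   # unexplained micro-charge
    ("Utility Co", 89.99),
    ("proc-fee-9913", 1.50),   # another one
]
print(flag_suspicious(statement))
```

Running something like this against a monthly statement surfaces exactly the kind of $2-scale charges discussed earlier, which are designed to slip past a casual scan.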
2. Defending Against Algorithmic Bias
Control Your Digital Footprint: Regularly check and correct publicly available financial and employment records.
Leverage Multiple Platforms: Apply for jobs, loans, and services across different systems to avoid silent blacklisting.
Document Everything: Keep records of denials, fees, and system errors as proof in case of disputes.
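For the "document everything" step, even a simple append-only log helps establish a pattern over time. The file name and record fields here are just one possible layout, not a prescribed format:

```python
import csv
from datetime import date

LOG_FILE = "disputes_log.csv"  # hypothetical file name

def log_incident(kind, detail, amount=None, path=LOG_FILE):
    """Append one dated record (denial, fee, or system error) to a CSV log."""
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([date.today().isoformat(), kind, detail,
                         "" if amount is None else amount])

# Example entries (hypothetical incidents):
log_incident("payment_decline", "Card declined despite available funds")
log_incident("unexplained_fee", "Processing fee with no matching service", 2.00)
```

Dated, consistent records are what turn isolated "errors" into documented patterns you can point to in a dispute.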
Final Takeaways
Financial instability isn’t always a side effect—it can be the goal: Be aware of corporate structures that offload risk onto employees while maintaining power.
Control is the true currency: Whether through finances, employment, or reputation, those in power manipulate digital and economic systems to their advantage.
Your best defense is awareness and action: Recognizing these tactics before they escalate can protect you from falling into exploitative traps.