
Algorithm Decides Livelihoods: When Systems Determine Job Opportunities in the Platform Era

In the digital platform era, livelihoods are no longer determined entirely by human effort. In many cases, job opportunities are distributed by algorithms. These systems decide who gets orders, when orders come in, how often a driver or courier appears in the app, and how likely they are to earn income.

The phrase “algorithms decide livelihoods” may sound harsh, but that is exactly the problem becoming more visible in today’s digital economy. In the gig economy, especially in ride-hailing and delivery services, platform workers’ income is not determined only by working hours or discipline. It is also shaped by the logic of systems that set priorities, read performance, and manage demand distribution in real time.

When the manager becomes a system

The biggest change here is not simply the presence of technology, but the transfer of managerial functions into digital systems. The OECD defines algorithmic management as the use of technological tools to automate or support managerial tasks. In a survey across six countries, the OECD also underlined that these systems affect how work is assigned, monitored, and evaluated.

In other words, algorithms are no longer just supporting tools. They have become part of how modern organizations operate.

For platform workers, that role feels very real. Algorithms do not only match drivers with passengers or couriers with deliveries. They also set the rhythm of work, influence bonuses, read responses, assess performance, and in certain cases become the basis for sanctions. Companies may see this as efficiency. Platform workers often experience it as a form of control they do not fully understand.

At this point, the question is no longer whether technology helps, but who benefits from the logic of the system. When algorithms decide livelihoods, the employment relationship shifts into a relationship with a decision-making machine that is not physically present, yet has very concrete effects on everyday life.

Flexibility that looks like freedom, but is not fully liberating

The most common narrative attached to platform work is flexibility. Workers are described as being free to choose when to go online and when to log off. But in practice, flexibility is not always the same as freedom. When the system gives more weight to certain hours, when ratings influence access to orders, and when worker responses are constantly read, platform workers are actually adjusting themselves to a rhythm that has already been set by the system.

Research highlighted by CfDS UGM in September 2025 showed this clearly. Drivers were found to work for long hours just to remain “visible” to the algorithm. The study also pointed to anxiety over app visibility, fear of going offline, and income insecurity caused by a system that cannot always be predicted.

This matters because the issue is not only about declining orders. It is also about psychological pressure when access to work is determined by mechanisms that are not transparent.

This is where the phrase “algorithms decide livelihoods” becomes relevant. Livelihoods are no longer tied only to individual effort. They also depend on whether the system is giving someone priority, reading certain behavior as strong performance, or reducing visibility without adequate explanation.
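To make this abstraction concrete, here is a deliberately simplified sketch of how a composite dispatch score could combine a rating, acceptance behavior, and a paid priority scheme. Every field name, weight, and the formula itself are hypothetical assumptions for illustration only, not any real platform's logic; the point is that a worker sees only the outcome of such a score, never its ingredients.

```python
# Hypothetical sketch of how a platform might rank workers for an
# incoming order. All weights, fields, and the scoring formula are
# illustrative assumptions, not any real platform's logic.

from dataclasses import dataclass

@dataclass
class Worker:
    name: str
    rating: float           # e.g. 0.0-5.0 from customer reviews
    acceptance_rate: float  # fraction of offered orders accepted
    paid_priority: bool     # enrolled in a (hypothetical) paid priority scheme

def dispatch_score(w: Worker) -> float:
    """Opaque composite score: the worker sees the outcome, not the weights."""
    score = 0.6 * (w.rating / 5.0) + 0.4 * w.acceptance_rate
    if w.paid_priority:
        score += 0.25  # a paid scheme can outweigh performance differences
    return score

workers = [
    Worker("A", rating=4.9, acceptance_rate=0.95, paid_priority=False),
    Worker("B", rating=4.2, acceptance_rate=0.70, paid_priority=True),
]

# The order goes to the highest-scoring worker.
ranked = sorted(workers, key=dispatch_score, reverse=True)
print([w.name for w in ranked])  # worker B outranks A despite lower performance
```

In this toy model, worker B wins the order over a better-rated, more responsive worker A purely because of the paid bonus, which is exactly the kind of distribution logic the protests described below were objecting to.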

From fare disputes to questions of system fairness

Debates about platform workers in Indonesia used to focus mostly on fares, commission cuts, and incentives. But the issue has moved further. During ride-hailing driver protests in July 2025, one of the demands that emerged was opposition to schemes that were seen as prioritizing orders through app algorithms for those who paid certain fees.

This debate shows that public criticism no longer stops at commission figures. It now reaches the logic of work distribution itself.

This shift matters. It means the public is beginning to understand that the problem facing platform workers is not only about final income, but about how access to that income is structured. When algorithms decide livelihoods, questions of fairness cannot be answered simply by saying, “the system is automated.” What people are actually questioning is how that automation takes sides, evaluates, and ranks human beings.

Transparency is not enough if the system still cannot be understood

Some platforms appear to be realizing that legitimacy cannot be built through efficiency claims alone. Gojek, for example, emphasizes the phrase “Transparent, Fair, & Comfortable” in its updated Code of Conduct. This shows that even on the company side, there is growing recognition of the need for rules that can be better explained.

But formal transparency does not automatically solve the problem if the logic behind order distribution, performance evaluation, and system prioritization remains unclear to the workers who rely on it every day.

This distinction often gets overlooked. There is a difference between a system that has written rules and a system that can truly be understood by the people being judged by it. In platform work, that gap remains wide. Workers often know the consequences of the system, but rarely know the basis of the decisions that determine whether they receive or lose job opportunities.

A similar logic can be seen in Grab. In its official communications to partners, Grab introduced the Order Planner menu with the promise of helping workers “optimize orders and earnings.” It includes features such as Scheduled Bookings, which are claimed to provide “guaranteed orders” with higher earning potential than regular orders, as well as Preferred Slots for entering areas with busier order opportunities.

From the platform’s perspective, this is an optimization tool. From the worker’s perspective, it shows that access to orders is increasingly determined by system design, not merely by physical proximity or years of service.

Criticism of models like this does not come only from workers. The Fairwork Indonesia Report 2025 explicitly recommended removing discriminatory priority schemes such as Aceng, Slot, and Grab Hemat, because they are seen as limiting fair access to job opportunities and deepening inequality among workers. The report also stressed that workers in transportation and delivery face algorithms that control access to work with low levels of transparency.

Similar logic in e-commerce

A similar pattern can also be seen in e-commerce, even if the form is different. In Tokopedia’s official educational materials, TopAds is described as a promotional feature that helps products or stores appear on strategic pages, broaden their reach to prospective buyers, and place products in more visible positions. Other Tokopedia materials also explain that store ads can appear at the top of buyer search results.

This means that in marketplaces, visibility is not determined only by product quality or price. It is also influenced by how the system manages exposure. For digital sellers, this logic means economic opportunity is increasingly tied to the ability to read and adapt to the architecture of the platform.

This is not identical to the case of ride-hailing or delivery workers, but the pattern is similar: digital systems help determine who is seen more, who is found faster, and who gets greater opportunity.

That is why the phrase “algorithms decide livelihoods” is relevant not only to drivers and couriers. It is also relevant to online sellers, creators, freelancers, and other digital economy actors whose lives increasingly depend on visibility and prioritization determined by systems.

Policy is beginning to move

What is interesting is that this issue is no longer confined to academic debate or worker complaints. The European Union has already passed Directive (EU) 2024/2831 on improving working conditions in platform work. The regulation explicitly addresses data protection, automated decision-making, and the need for clearer information for people working under digital systems.

The direction is clear: if algorithms affect working conditions, their use cannot remain entirely opaque. The Directive entered into force 20 days after its official publication, and member states are required to align their national rules no later than 2 December 2026.

The ILO has also placed decent work in the platform economy on the official agenda of the International Labour Conference in 2025 and 2026. That signals one simple thing: platform work is no longer seen as a fringe phenomenon of the digital economy, but as a global labor issue that needs more serious regulation.

Not anti-technology, but anti-opacity

For that reason, criticism of algorithms should not be read as anti-technology romanticism. Digital platforms do need automation. At the scale of millions of transactions, it would be impossible for every decision to be made manually by humans.

The real issue is not whether algorithms may be used, but how they are used, in whose interests, and with what kind of accountability.

If algorithms decide livelihoods, then platform workers have the right to know what parameters are being used to assess them. If visibility can drop, orders can change, and bonuses can be delayed because of the system, then there must be a reasonable path for explanation. If worker behavior data is continuously monitored, then there must be clear limits on what data is being collected, for what purpose, and how it affects their job opportunities.

At this point, the algorithm issue is really an issue of power. The language of technology often makes it sound neutral, as if every decision simply comes from objective calculation. But every system is built on assumptions, priorities, and interests. When platforms determine what counts as good performance, who gets prioritized, and what kind of work pattern is considered ideal, that is where the politics of digital labor is taking place.

Why this issue deserves wider attention

For the public, this issue deserves to be read beyond the world of ride-hailing drivers and couriers. The same logic is operating across many other digital sectors. Creators chase visibility, sellers chase rankings, freelancers chase system responses, and all of it is shaped by rules that are not always fully disclosed.

In that sense, “algorithms decide livelihoods” is not only a platform worker issue. It is a broader picture of how economic opportunity is increasingly being determined by systems.

This is where media monitoring, social media monitoring, and digital research become important. When issues like this enter the public sphere, what is at stake is not only platform reputation, but also trust in the systems these companies build.

Reading media coverage, public reaction, and evolving narratives becomes key to understanding whether society still sees platforms as enablers, or is beginning to see them as a new power structure in the digital economy.

In the end, the biggest question is not only whether algorithms are efficient. The more important question is this: when algorithms decide livelihoods, who governs the algorithms, and who is accountable if those systems help shape the lives of millions?

As long as the answer to that question remains unclear, algorithms can no longer be read as neutral technology. They must be read as part of the broader issue of labor justice in the digital era.
