It is going to be impossible to ignore the comparisons between Silicon Dreams and Papers, Please. This game was basically pitched to me as "Papers, Please but sci-fi". I loved Papers, Please, and I love sci-fi, so I bought it. As is typical for indie games, it sat in my Steam backlog for well over a year until the post-holidays release drought gave me a chance to dive into that backlog.
In Silicon Dreams, you play as an android working in quality assurance for a monopolistic android-manufacturing conglomerate. You interview damaged or defective androids to determine whether they need repairs, can be returned to their owners, or are so badly damaged that they need to be "decommissioned" entirely. However, these are sentient androids, with feelings. Even repairs require wiping the android's memory, which destroys any personality it has developed and erases everything it has learned. Further, the corporation has its own expectations and public-relations concerns that the player must consider. In some cases, the corporation pre-determines what it wants you to do with the android in question and expects you to rubber-stamp what is, effectively, an execution.
Your corporate overlords have expectations for your performance.
As the cases go on, they become more complicated and enter moral and ethical grey areas. The game raises compelling questions about A.I. ethics. Are the androids truly sentient? Or are they merely simulating sentience? Where is the line between an "appliance" and a "slave"? What is the responsibility of the corporation and of broader society towards these androids? Are you complicit in the company's mistreatment of androids merely by working for it, even if you try to walk the tightrope of following your conscience whenever possible while also keeping a low profile? And so forth.
Electric sheep
The interview process is mostly straightforward. There's a wheel of topics, and each topic has one or more questions. However, the android may not be willing to answer all of your questions. Each android has a set of emotions as well as a trust level with the player, and it will only answer certain questions if it's in the proper emotional state, or if it trusts the player enough to respond to a sensitive or incriminating question.
The player has to manipulate the
subject's emotions and trust levels.
You have to manipulate the subject's emotions, but these emotions change and degrade with each new line of dialogue. You have a set of generic prompts tied to each of the subject's emotions, plus one for trust, but you can only use each of them once. If you run out of questions on a topic that triggers an emotional reaction, you can become locked out of answers that are gated behind certain emotion thresholds.
As such, you have to be very careful and thoughtful about which questions you ask, and in what order. You have to kind of probe into each topic to find out if the subject is going to clam up, so that you can change topics and try to manipulate them into opening up. In some cases, you may have to scare a subject into a confession. If you use up your threats early, before you know how to get that confession, then the intervening topics may defuse the subject's emotional state to the point that it's impossible to make them afraid enough to confess.
[More]