You know, FUNCTIONALLY Data has emotional responses. His programming evidently includes a system that generates personal preferences through value judgments, probably based on a partially random initial seed implanted by Dr. Soong at the moment of his activation. Whenever Data is required to make a choice, his operating system gives him some kind of negative or positive feedback, which allows him to generate preferences and subsequently reinforce them. That in essence grants him pathos, even if technically it isn't moderated by biochemistry. When you see him in The Most Toys, confronted with probably the most intense dilemma he's ever faced, the situation plainly causes him a state of mind equivalent to distress. It's a different kind of emotional distress than a human would subjectively experience, but there's no functional distinction. It is painful for him to be forced to choose between a set of actions that are all strongly against his personality preferences. Being pushed into such a scenario causes instability not unlike human irrationality, to the point that Data briefly became capable of lying, and of killing. He was also shown to be capable of irrational behavior in pursuit of things he strongly preferred.
What happens in a sentient android's mind when he does something, when he achieves something, that his system's preferences describe as positive? What happens in his mind when he becomes aware of a goal his personality matrix obliges him to pursue? Wouldn't he require some kind of positive/negative reinforcement in his mechanical brain to actually make choices and act on them? Something must be happening there to create positive and negative reactions... FEELINGS, if you will. In other words, to have preferences and the autonomous will to act, there has to be some kind of internal sensation that tells you which outcomes and actions are good and which are bad. Some species of sensation in his brain drives and motivates him, and that motivation is fundamentally what emotion is all about. Otherwise Data would never do anything except what he is explicitly commanded or programmed to do, he would never generate or follow personal preferences, and his personality would be unchanging and utterly predictable. Philosophically, I think there's no difference between the sensations in Data's mind that drive his behavior and the sensations in a human mind that drive ours. You could call both things "emotions".
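The mechanism argued above, a partially random seed at activation plus feedback that reinforces preferences over time, can be sketched as a toy model. To be clear, everything here (the class name, the options, the numbers) is my own illustrative assumption, not anything stated in the show:

```python
import random

class PreferenceModel:
    """Toy sketch: preferences start from a random seed and are
    reinforced or weakened by positive/negative feedback."""

    def __init__(self, options, seed=None):
        rng = random.Random(seed)  # the "partially random seed" at activation
        # Each option starts with a small random bias rather than zero,
        # so the agent already has (weak) preferences before any experience.
        self.weights = {opt: rng.uniform(-0.1, 0.1) for opt in options}

    def choose(self):
        # Act on the currently strongest preference.
        return max(self.weights, key=self.weights.get)

    def feedback(self, option, signal):
        # A positive or negative "sensation" reinforces or weakens
        # the preference that produced the choice.
        self.weights[option] += signal

agent = PreferenceModel(["paint", "play violin", "poker"], seed=42)
choice = agent.choose()
agent.feedback(choice, +1.0)  # positive reaction strengthens the preference
```

The point of the sketch is the last line: without some internal signal feeding back into the weights, the agent's behavior would never change, which is exactly the "unchanging and utterly predictable" personality described above.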
If there existed an alien race, evolved on a different planet, with human-level intelligence, they too would have an alien subjective emotional world, but we wouldn't say they "lack emotions" as long as their choices and preferences were governed by behavioral patterns responding to sensory input.
Just something I think about from time to time while watching TNG. There's good reason the Federation and the crew of the Enterprise consider Data a living, sentient being. All intelligent living things are controlled by extremely complex algorithms (complex to the point of being difficult or impossible to predict deterministically). Data is no different.