Why is utilitarianism rubbish? We are supposed to perform whichever action will produce a net increase in welfare, but how are we to judge whether a particular action will actually do so? If we are forced to make that judgment, must we not fall back on some internal moral sense or set of commitments we already hold, which makes it impossible to calculate impartially what will increase the general happiness? And doesn't utilitarianism require us to act like calculating machines, setting aside what we actually feel?