Many of the ubiquitous algorithmic systems permeating society have come under scrutiny for their lack of accountability. As algorithmic decision making increasingly affects our lives, calls to improve the transparency of these systems are met with social, legal and technical limitations that challenge whether transparency alone is the solution to algorithmic accountability. In my dissertation, I explore the role of algorithmic transparency, algorithmic literacy and related issues as approaches towards holding algorithmic systems more accountable. Bridging the HCI and STS communities, my work is grounded in a critical ethnography of algorithmic systems and their impact on their stakeholders. Through this approach, I aim to provide both theoretical insights and material solutions to the problem of accountability. By unpacking the complex socio-technical assemblages that make up these systems and employing both participatory and user-centred design principles, my goal is to co-design measures that support sense-making of algorithmic processes and allow stakeholders to hold these systems accountable.