Monthly Archives: November 2014

Narrative and compassion in management practice


For many years now I have been reminding myself of the reasons why my studies in English Literature have been so important to my own professional development. Through this self-reassurance I have constantly re-affirmed the idea that if one can understand how a story or narrative is constructed, then one can better understand how the world itself is constructed. In this sense, it is interesting that Peter Brophy draws upon theories of narrative and story to inform ‘Evidence Based Library and Information Practice’ (EBLIP). However, I do feel that there is one key characteristic missing from his theory, and it is of primary importance to storytelling. This year I have read a great deal of information and management theory relating to managing, teamwork, collaboration and leadership, among other topics, but nowhere in that material have I encountered reflections on the importance of ‘compassion’ in management and teamwork. It is only through compassion that a narrator can effectively create characters in stories. Likewise, in organisations, compassion is the only way one can understand and work with the subjective, personal circumstances that people bring to work with them every day, and the only way one can accept employees’ and customers’ limitations while finding a way to work within them to achieve goals.

Brophy, drawing from Eldredge, outlines that evidence based learning and practice is both quantitative and qualitative, but that there remains an imbalance in which emphasis is placed on objective quantitative measures. However, he argues that this positivist approach does not apply well to librarianship, because librarianship is a social system with variables that cannot be controlled within human interactions. Brophy shows an awareness of the prevalence of poststructuralism in contemporary social, cultural and linguistic theory: “To add to the complexity, all we have to describe the world is language, which itself introduces ambiguity, bias and difference.” Poststructuralism holds that signs are not fixed word-images but experiences directed towards other signs, depending on the context of the receiver. To take quantitative measures and apply them objectively is therefore an impossible task. Even if one can take objective measures, these still have to be communicated to other people, who are free to interpret the findings based on their own observations, meaning there is never complete consensus about the evidence collected and how it is to be used.

This leads Brophy on to consider post-positivism and social constructivism as qualitative approaches that may inform EBLIP by affirming the prevalence of narrative in human interactions. He argues that “These approaches suggest that rather than emphasising the transmission of “facts” (accepted knowledge about the world), modern societies need to encourage learning which encompasses both openness to differing world views and the ability to relate new ideas to existing knowledge in meaningful ways, so that each of us is continually constructing, sharing, and reconstructing our understanding of the world in all its complexity.” Such an approach emphasises the value of narrative in developing a more complete understanding of contexts, which can lead to better decision-making by managers. This is because narrative allows one to look at evidence in context through structures such as culture, holism, in-depth studies and chronology.


One practical way this can be applied to understanding a library’s user services is through surveys. Surveys are, of course, a quantitative research method. However, it is also possible to hold interviews with users about the survey itself in order to add a layer of understanding to the results, giving users the chance to express their ideas in more subjective, less structured ways. This kind of evidence feeds into Brophy’s narrative approach. Of course, what it creates is a sequence of narratives which will still need to be gathered together into a coherent structure before it can be applied to improving a service. To truly understand a user group, one not only needs quantitative analysis of their habits and needs, but also an understanding of why they behave as they do, what motivates them, and what they need or desire. To achieve this, managers not only need to know how to read graphs and charts; they need to be able to read and understand people. And for this, compassion is a quality that all good readers, and subsequent narrators, possess, because it allows them to more fully understand the qualitative aspects of social interactions and systems.
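The mixed-methods idea above can be sketched in a few lines of code. This is a minimal illustration only, using entirely hypothetical survey data and interview themes (none of it comes from any real library study): a quantitative satisfaction score is summarised alongside the qualitative themes raised by dissatisfied users, so that the numbers can be read in the context of users’ own words.

```python
from statistics import mean

# Hypothetical mixed-methods data: each record pairs a Likert-style
# survey score (1-5) with themes coded from a follow-up interview
# with the same user. All names and values are illustrative.
responses = [
    {"user": "A", "satisfaction": 2, "themes": ["opening hours", "noise"]},
    {"user": "B", "satisfaction": 4, "themes": ["helpful staff"]},
    {"user": "C", "satisfaction": 2, "themes": ["noise", "seating"]},
]

# Quantitative layer: the average satisfaction score.
avg = mean(r["satisfaction"] for r in responses)

# Qualitative layer: the distinct themes raised by low scorers,
# which give the narrative context behind the low numbers.
low_score_themes = sorted(
    {t for r in responses if r["satisfaction"] <= 2 for t in r["themes"]}
)

print(f"average satisfaction: {avg:.2f}")
print(f"themes raised by dissatisfied users: {low_score_themes}")
```

The point of the sketch is simply that neither layer is meaningful on its own: the average score tells a manager that something is wrong, while the interview themes begin to tell the story of why.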

Is ‘managing’ innovation counterintuitive to creativity?


In ‘Innovation and entrepreneurship in information organisations’, Rowley explicates the term ‘innovation’ in relation to the concepts of entrepreneurship and creativity. He argues that as organisations come under increasing external funding pressure, there is a growing need for greater innovation within the information sector. However, this innovation needs to be managed, and processes need to be put in place to ensure innovative ideas are used effectively. I am not altogether convinced that Rowley’s ideas are thorough enough to promote innovation rather than suppress it.

Rowley describes innovation as ‘a multi-stage process whereby organisations transform ideas into new/improved products, services or processes.’ Organisations can do this effectively by managing these ideas through a controlled process, which Rowley frames as a kind of entrepreneurship. For innovation to be fully realised, he argues, we must recognise that innovation is inherently entrepreneurial. Definitions of entrepreneurship all revolve around taking ideas and turning them into success, whether by being creative or by utilising creative resources appropriately. Thus, the entrepreneurial process is one of interaction between individuals, their social networks, structures that objectify opportunities, and physical contexts. Entrepreneurship in this sense is explained as a systematic practice of innovation. This concept is finally related back to creativity, as Rowley accepts that all innovation comes from an initial creative design.

However, the one question that Rowley struggles to answer is how to put in place a framework that will allow employees to generate ideas in the first place. He accepts, for example, that innovators do not work well in bureaucracies, and that Google’s policy of allocating 20% of employees’ time to creative projects is unlikely to work within the public sector. Of course, Google can do this because it generates its own profits, whereas most libraries are allocated budgets by a parent organisation which will demand that resources produce quantifiable results. Rowley could perhaps have looked at the concept of ‘synergy’ as one way of promoting creativity within organisations, in which employees are made to feel ‘part’ of the organisation. However, cutting wages, reducing staff and demanding ‘more for less’ is not going to create synergy.

Finally, it is positive that Rowley is encouraging organisations to put in place a framework for managing innovation. On the one hand, the framework in itself may help increase innovation by showing staff that new ideas will be developed in a serious and effective manner. Coupling this framework with some kind of rewards programme may well be a solid starting point for fostering an environment, even within a bureaucracy, in which more creative ideas can emerge. On the other hand, I am always sceptical of the idea of controlling creativity, because to me it seems counterintuitive. Rowley’s framework proposes essentially to institutionalise ideas so that they can be controlled and channelled towards results. This may well have the opposite effect of discouraging creative thinking in an organisation.

IT = Innovative Management System or Panoptic Hegemonic Control?


Fatat Bouraad, in ‘The Emerging Operations Manager’, puts forward the thesis that the increasing reliance on IT services and IT-skilled staff requires a framework for developing new methods of management. As IT becomes more prevalent, new methods of observing, evaluating and managing staff also emerge, allowing for shifts in management styles. However, I think it is important to ask whether we are managing staff through IT, or whether IT is becoming a mechanism for a more totalitarian, even more strictly top-down style of management?

I recently read an article by Mike Sosteric called ‘Endowing Mediocrity’, in which the author posits that IT in all its forms emerges from an increasingly prevalent surveillance culture within business, education and social media. This surveillance is of course facilitated more easily through the use of IT, but rather than creating a flatter structure, it tends towards a deceitful panopticism. Sosteric (1999) argues that “Panoptic systems thus function as systems of behavioural and ideational (hegemonic) manipulation and control.” Bouraad may argue that IT allows for greater efficiency and a tendency towards a flat system, but he also argues for a framework through which this flat system should operate, which is somewhat contradictory. There needs to be an understanding that as we move further into the realm of IT-based systems, all of our actions are constantly under surveillance by the hierarchy within which we work.


Furthermore, modern communication systems were actually designed to create greater control over human targets. I use this language deliberately. Norbert Wiener is the father of modern IT-based communication systems: his wartime work on cybernetics grew out of attempts to predict and control the flight paths of military targets so that weapons could become more accurate. The endgame was always to gain greater control over the end user/target. The same principles now underpin modern computing. In the ‘know how to be’ stage of regulating new operations management theory, Bouraad argues that employees must remain up to date in order to develop a continued propensity towards innovation. However, innovation rarely comes about in an environment of surveillance. Most companies are either trying to control employees or attempting to control the consumer habits of targeted customers. Information industries have contributed increasingly to this manipulation of the end user through the spread of Big Data and internet monitoring. These issues have serious implications for libraries too, as they move towards digital and online forms of dissemination. We should of course embrace the many benefits that IT gives us, but we should never lose sight of where this IT has come from and the negative impact it can have on the personal liberty of our information professionals and the public they serve.

UCD Law LibGuide Review


The UCD Library Subject Guide for Law is very well laid out and follows the same style and structure as the other subject guides, making it easy to navigate. The webpage is clearly organised and provides easy access to a Subject Library Specialist, whose contact details are posted in an individual text box on the main page. The menu bar also provides access to a range of resources available to students, including books, journals, databases, websites, government information, newspapers and statistics. This range of material is more comprehensive than that of many other subject guides on the UCD Library website. In the individual descriptions of each resource there are some law-specific terms that users from other disciplines may not understand. This is an important point because many subjects now have a multi-disciplinary element, which means students from outside the Law Department are likely to need to navigate this particular subject guide. More user-friendly vocabulary would help such students.

The LibGuide for Law does not contain the video guide to using OneSearch that appears on other subject guide pages; it does, however, provide an external link to more general library user guides. This is disappointing, in that a user who is not familiar with library search strategies will have to navigate away from the main subject guide page in order to learn how to use the search engines needed to locate the information and resources that the guide is promoting. This is a feature that could easily be integrated into the current site.

Finally, one element of the guide that I really like is the UCD Law Department Twitter feed, which runs at the side of the page. This is complemented by widgets linking to the other social media platforms the department runs. Studying a subject is not just about finding information; it is also about contributing to a community of like-minded people. The fact that the Law Department acknowledges and understands this is very encouraging. Advertising these social media platforms also gives the department more exposure across a much broader range of library users, which obviously enhances its profile across the university.

Building Speculative Future Capital into Digital Curation and Embracing the Changing ‘Signs’ in Metadata


The following essay discusses the case study ‘Using the DCC Lifecycle Model to Curate a Gene Expression Database’. There is little doubting the benefit of such a project. Gene expression in early human foetal development forms the foundation of how all human life develops, so understanding it can help scientists and medical professionals to better understand human growth in relation to the development of disease both early and later in life. Recent research indicates that most genetic diseases begin at this early stage of human development, so an archive that allows scientists to draw upon previous gene samples and subsequent experimental results is invaluable in understanding where disease comes from and, by extension, how to prevent and cure it. However, digital curation, no matter how much planning and policy outlining is involved, and no matter how valuable the collection is, is ultimately at the mercy of financial sponsors. Therefore, while it is important to plan through the full life cycle of a project, it is just as important to build ideas for future ongoing funding into the project and, if possible, either to make the project profitable or to suggest ways in which it could generate revenue based on possible discoveries. In this case, clear guidelines have to be laid down about the future rights of the project, even if funding is taken over by private enterprises interested in potential discoveries that could be made from the use of those datasets. In this sense, the Gene Expression Database runs into some problems that could hinder its long-term sustainability.
The case study focuses quite heavily on the technical challenges of the project, as well as on the cycle from creator to end user, or designated community. However, some of the technical plans are undermined by a lack of clear financial planning, while the designated community needs clearer policies about the future human infrastructure and about how representation information and metadata may evolve beyond the life of current creators and users.

Firstly, the case study places a strong focus on the technical processes and applications that will be needed to provide long-term security and accessibility for the gene expression database. This is emphasised in the study (O’Donoghue & van Hemert 2009, p. 58): “One of the main concerns on the informatics side of the design study is how to curate this resource over the long term. DGEMap will not be a simple archive of images, but rather a constantly changing project with several types of research output and digital assets that will require both coordination and preservation.” The results of the experiments will be processed in local databases, where raw digital images will be created, cleaned using Photoshop, and enriched with representation information and metadata, before being transferred to searchable databases online. The datasets will then also need to be archived and stored in a long-term digital repository. DGEMap, in this sense (O’Donoghue & van Hemert 2009, p. 59), “comprises two constantly changing databases and a large quantity of images that need to be transformed and mapped before being submitted to one of those databases”. The aim of this essay is not to describe the technical processes in detail, other than to say that the curators have decided to use open source tools (i.e. MySQL, DRAMBORA, AONSII and SIARD) in order to facilitate later open access to the databases and to ensure consistency across all platforms. They will also apply the OAIS model for the same reason, used in conjunction with MISFISHIE, METS and PREMIS.
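To make the ingest step above concrete, the following is a minimal sketch, not DGEMap’s actual implementation: it imagines a submission package in which a raw image is paired with representation information and given a PREMIS-style fixity checksum before submission. All class and field names here are hypothetical illustrations of the general pattern, not identifiers from the case study.

```python
import hashlib
import json
from dataclasses import dataclass, field

@dataclass
class SubmissionPackage:
    """A hypothetical ingest unit: one raw image plus its metadata."""
    image_bytes: bytes
    representation_info: dict           # e.g. format and rendering requirements
    preservation_metadata: dict = field(default_factory=dict)

    def add_fixity(self):
        # Recording a fixity checksum is a standard PREMIS-style
        # preservation event: it lets future curators verify that the
        # stored image has not been silently corrupted.
        digest = hashlib.sha256(self.image_bytes).hexdigest()
        self.preservation_metadata["fixity"] = {
            "algorithm": "SHA-256",
            "value": digest,
        }

# Illustrative usage with placeholder image data.
pkg = SubmissionPackage(
    image_bytes=b"...raw gene expression image data...",
    representation_info={"format": "TIFF", "rendering": "any TIFF viewer"},
)
pkg.add_fixity()
print(json.dumps(pkg.preservation_metadata, indent=2))
```

The sketch is deliberately small; the point is that representation information (how to render the object) and preservation metadata (how to trust it over time) are distinct layers, which is exactly the separation that OAIS, METS and PREMIS formalise.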

It is surprising to find, amid such detailed technical planning, one glaring shortcoming in the case study. That shortcoming relates to the project’s funding and budget, over which the curators of DGEMap do not appear to have full control. The project has funding for ten years from the European Union; however, they envision the project having to be sustained indefinitely. Moreover, the actual budget that has been allocated is never discussed within the case study, which prevents readers from fully understanding the policies of the project. For example, DGEMap (O’Donoghue & van Hemert 2009, p. 64) propose the use of “the Dark Archive In The Sunshine State because it is intended for back-end use to other systems, so while having no public interface, it can be used in conjunction with other access systems which adds to the protection of the data”. However, they (O’Donoghue & van Hemert 2009, p. 64) go on to add that “one problem with this usage is the exorbitant costs which may not be sustainable over the long-term. DGEMap propose investigating other storage options further”. It is clear why they want to use DAITSS, but it is worrying that they do not have the budget to use the most suitable storage facility, even for a ten-year period, let alone for their indefinite storage needs. The questions that remain over funding call into question the longevity of such a project once the EU comes to the end of its funding obligations. This uncertainty is naturally exacerbated by the fact that the EU can undergo considerable economic fluctuations, which can further disrupt long-term funding commitments.

One might argue that a curator cannot plan for an indefinite period, and that if one were to attempt to do so, the project would never begin in the first place. However, because data is going to be constantly added to the database, the case study could do more to emphasise the ongoing conceptualisation of the project. The first cycle of data has already been conceptualised, and funding is in place for the initial stages. However, the DCC Lifecycle Model is not a one-cycle model, something the curators of DGEMap identify but never build upon. They (O’Donoghue & van Hemert 2009, p. 68) accept that as future users access the data and use it to perform new analyses, the data is transformed and re-entered into the start of the cycle again. It is the argument of this essay that this ongoing reconceptualisation gives the curators space to continually update potential future funding partners on the achievements of the project. Each experiment builds upon previous experiments, and each new set of knowledge brings the project closer to discoveries that could lead to new treatments for disease. Each time a reconceptualisation happens, these discoveries, or steps towards discoveries, could be capitalised on to continue adding scientific and monetary value to the project with a view to acquiring future funding. The DCC Lifecycle Model allows for this reconceptualisation, and it needs to be written into the policies of DGEMap.


The second area of interest in this case study lies in its explication of policies based around human infrastructure. Obviously, there is going to be an ever-evolving body of participants in this project as new creators join. These creators are initially responsible for adding representation information to the digital files, and this information is later checked by curators with the aim of maintaining consistent linguistic labelling. In this sense, the policies of DGEMap aim to control the human infrastructure to ensure consistency. This is why they use PREMIS: to ensure the information remains readable by the community, which means the metadata will need to correspond to knowledge in the designated community. However, this writer believes that any attempt to enforce a static nature onto language is destructive to the necessary evolution of such projects. Contemporary linguistic theory has been demonstrating since the 1960s that language is far from static; on the contrary, it is ever-changing and malleable in any discipline. Within the realm of science, the labels we use to signify meaning can play a role in promoting creativity in research and experimentation methods. Because the datasets are stored in two locations, there will always be a static record of the results and the images. Allowing creators, who enter the project with new, evolving perspectives and, by extension, new, more relevant linguistic codes, to develop the language used in the metadata in a natural way can only speed up the discovery process. In this sense, words do not come into existence from a vacuum, but form an ever-evolving chain of signifiers that can add meaningfully to the growing body of knowledge being stored. Again, there is plenty of scope within the policies of DGEMap to develop in this more open and organic way.
For example, when referring to the monitoring of the designated community, the case study (O’Donoghue & van Hemert 2009, p. 68) admits: “Effectively, DGEMap would be harnessing the knowledge of its designated community to increase the use and importance of the public database, allowing an even better resource to develop over time.” There is a sense that ‘monitoring’ is too distant a term, and that more control of the project needs to be handed over to the creators and end users who are developing it.

In conclusion, this essay has examined two important features of the case study on gene expression data storage (DGEMap) in ways that might allow the curators of that project to develop their policies more rigorously. The first part of the essay examined ways in which budget constraints and long-term funding uncertainty can undermine even the most careful technical and conceptual planning. It suggested that, by more fully utilising and emphasising the ongoing conceptualisation of the new datasets entering the lifecycle of the project, the curators can build in a framework that allows the project constantly to court new funding partners beyond the ten-year EU funding base. The second part of the essay cast some doubt on the project’s attempt to control its human infrastructure in a way that may hinder the development of new discoveries. It suggested that digital curators need to understand contemporary linguistic theory more fully, as it can encourage them to embrace the unavoidable nature of linguistic evolution, which can add even greater momentum to the discovery process. This is especially true given that curators aim to store information for future generations: if there is a linguistic disconnect between end users and original creators, then the data may become misinterpreted or even linguistically obsolete.


O’Donoghue, Jean & van Hemert, Jano I. (2009), ‘Using the DCC Lifecycle Model to Curate a Gene Expression Database: A Case Study’, The International Journal of Digital Curation, Volume 4, Issue 3.