Tagging and analysing data in Housing
It doesn't have to be a manual process. There are better things you could be doing with that time!
How do you handle your data now?
So you know where all your data is and who it belongs to. Perhaps you even have access to it all. The important question to ask is: what's your process for categorising and analysing it?
- Are you manually trawling through spreadsheets?
- Do you use rigid keyword taxonomies to categorise feedback?
- Do you perform consistent sentiment analysis?
- Can you track how your data changes over time?
Ask yourself (and your team, if applicable) these questions to help understand where the gaps are in your analysis processes.
How can Wordnerds help?
Having worked with various housing associations, we have accumulated data over time on how these organisations measure and categorise feedback. From this, we've created a plug-and-play bank of language models trained on UK housing data, designed to make it easier for insight teams to get started on their journey to data greatness, while allowing HAs to benchmark across the sector.
We can also share case studies on how we've helped your housing colleagues — always good to know what's working for others, right?
But what if you just need the essentials on how best to tag, categorise, and analyse your data? Well, we've got your back. Here are our most valuable resources on that very topic:
Nerdcast-live:
Complaints handling
How can Housing Associations navigate the increasingly tricky landscape around tenant feedback, while maximising learnings from complaints?
Case study:
Karbon Homes
Have a look at how Karbon Homes initially uncovered patterns in their data indicating a problem with damp and mould, and how they tackled the problem.
Nerdcast-live:
PAPA framework
Using data from UK Housing Associations, Wordnerds put together a framework for tagging, prioritising, and actioning tenant feedback.
Research:
Bias in AI
Using AI in our day-to-day work, we thought it would be valuable to conduct some research into its limitations. Turns out it has biases, just like us.
> Read now