I'm sure everyone knows someone that they would refer to as a Cyclist. This is a very important distinction - these are the people who think that cars are a nuisance clogging up their roads, & they are downright dangerous (dangling modifier left to the reader to appropriate).
I've never been a cyclist. I've ridden my bike - often stupidly down narrow edges on the side of a fast open highway - but not to the point of being an enthusiast, rather as a dare-devil.
When I got a motorbike licence, things changed. The first thing you notice is that there's next-to-nothing between you & the road. Once you get over that, you realise that what every motorbike rider has told you is true - car drivers are out to kill you, & it's your task to determine which of them is most likely to succeed. Please only laugh if you ride on two wheels regularly.
I believe it comes down to this: car drivers have stepped outside of the real world when they step into their tonne-of-steel death machine. They no longer have control of their destiny, because they don't have a clue how a vehicle works, & therefore it must be someone else's fault if the brakes aren't applied quickly enough, or the vehicle is moving too fast. They couldn't possibly be in control of such a behemoth. It defies logic to believe that they have the nous to direct such advanced machinery's destiny. People abrogate all responsibility when faced with the use of a technology they do not understand.
Road rules don't apply. Every driver knows the rules, but the cars don't. It must be the car's fault that it drifted into the other lane, because the driver wanted it to go straight. That's a shame. The car might even get damaged if it does that again.
This is quite different from what goes through the mind of a rider - I must take that next corner at such-&-such a speed, tilting the bike to such-&-such angle at approximately such-&-such position within the lane - or else I will probably die.
Interestingly, I was putting these ideas together while I was walking - reminded of a Piers Anthony novel about Satan... That's what I do when I've got lots of time on my hands - go for long walks to do things that might take only a few minutes if I was driving. The thoughts that go through your head are along the lines of "What a lovely hot day - look, some shade to walk through!", "I wonder if I'll get to the next street in one change of lights or two?", "I wonder what time it is?"
These are the thoughts of someone on holidays, admittedly - someone who doesn't have to be somewhere right now & can't blame anyone else if they're not.
I was walking at my own pace. I was using my own legs. I had total responsibility for where I was, how fast I was going, where I was going to, how, & whether I could just stop for a second in the middle of the footpath to see what the next tune on my mp3 player might be.
Thoreau had it right a long time ago - being in complete control of your own destiny, & having responsibility for what you do, & how you do it, is truly liberating, from a spiritual point of view, even if it puts the onus for everything back on you. That, in itself, makes you re-evaluate everything to determine how much of it is actually necessary or useful, & how much of it is simply the guff that society expects of us with no return.
We've lost our way, my fellow bipeds. We've all become users of technology - drivers of cars - machines we don't understand & don't believe that we can control. That vehicle is a metaphor for what I will now laughingly call "lifestyle". We need to get back on two wheels to understand the environment that we're otherwise whizzing through thoughtlessly, & then we need to get back on our own two feet & just enjoy the journey of life at a far more natural pace.
31 December 2012
In the Name of Service
I don’t want to sound like I didn’t have a good time, but I couldn’t help noticing - or being reminded that I always notice - the little things that make being away from home such a drag. Staying in a hotel should be fun. It should be full of little (nice) surprises, like a particularly nice bathroom with gold taps or a forceful shower that makes you want to luxuriate under it for just that little bit longer.
What I found, however, was a small bar of soap hidden in a box next to a “sanitary pack” (don’t ask). The shower had the kind of half-heartedness displayed by the night-staff - as if it didn’t want to be there either.
The temperature controls took a day to work properly - or perhaps when my cousin visited, he managed to jog it back into life. I froze on the first night, with the heat on maximum. Speaking of cold comfort - what’s the point of having a fridge in the room if it’s filled with things you know are exorbitantly priced? Can’t I have the option of getting an empty fridge?
In the next hotel, the fridge had some room, but there was no bottle opener or wine glasses once I got to use it!
I am not talking about cheap or low-star hotels, either - one has an undeservedly good name, & the other was considered excellent in a highly competitive area. Its decor was classic & stylish - the staff were uncouth & inexperienced.
Two incidents come to mind. My good lady left her phone charger at home, so all we had was the car charger, & we had no intention of driving around in circles. I rang reception to ask if we could borrow a charger & they said that they had lots, but I’d best come around with the phone, just to be sure. I should point out that the phone in question has one of the most common mini-USB charger ports currently available. Out of a box full of chargers, not one was less than five years old. My guess is that they had been discarded by previous guests over the years as being useless. The staff were unapologetic. When I later asked in passing at the front desk if I would find a corkscrew in my room (we were in a wine district), the desk attendant had to shout out to a duty manager to ask if that was a normal item provided, & the shout back was an assurance it was. Fortunately, I ignored them & took one from the bar.
The restaurant in that establishment offered fusion cuisine, which I usually would avoid, but I was so inebriated from a day tour of wineries that I felt compelled to give it a try. Again, the converted sprawling estate provided a wonderful atmosphere for the food to be served in. However, the staff took an approach to the clientele that reflected their collective training at the local burger emporium.
The perky words “How is your meal?” sounded so much like “Would you like fries with that?” Believe me, fries would have been quite inappropriate for the whole menu. They would have looked about as in place as the “Australian Menu” section in a country town Chinese restaurant (where you find the fish & chips offering).
I was tempted to take the express check-out (on both occasions), & I resisted writing comments on those nasty little feedback forms (OK, so I did do it at the restaurant, but I was drunk - so they’ll never understand it). I would love to know if anyone reads those things, or whether they at least take the time to look at the statistics of how little actual feedback they get.
What happened to the good old days when the differentiation between stars was noticeable in the free stuff - the quality of the shampoo, the robe in the cupboard, the view over the pool (or ocean), people to carry your bags - all of the things you see in old movies? Now, everyone who is no-one goes to a hotel for a night with no expectations (always fulfilled), just for the branding, rather than the experience.
I want to be pampered. I want to feel as if I am getting “something” for my money beyond four walls, a mediocre shower, & a nicely made bed. Is that really too much to ask? I want some service from someone who may be surly but is at least professional. I want to feel as though I am special - a guest - not a customer.
Yes, I know, you can’t go to a prostitute & complain about not feeling loved, but I would at least like to think that one of us is faking it.
Democracy - What is it good for?
- a right to vote
- representation
As for representation, what is being represented? Whose interests? In many systems that have inherited the English parliament, there are effectively only two parties vying for power. These don’t represent anyone but themselves. They try to attract the votes of the people (voters). In some countries, the voters are representatives of the parties!
Under these circumstances, it is very hard for anyone outside of the two parties to get a chance - with some exceptions. We always talk in terms of two-party preferred polling (that is, the last two candidates standing when all others’ votes are exhausted). This is an essentially meaningless number because it ignores what people actually voted for. If the two parties represent only 70% of the primary vote, say, then that’s a lot of discontent with the two-party system.
It doesn’t take much to work out that those being elected do not represent those doing the voting. Therefore, you need to either change the pool from which those being elected arise (that is, remove the parties), or else change who (or what) gets to vote.
In a country where a third of the population weren’t born here, it seems strange that there is no political force that represents migrants. A community-based voting mechanism would make more sense. New Zealand’s parliament, for example, has guaranteed native representation. Australia’s native population is quite small (relatively), but representation is almost accidental.
This would allow large blocs of people - communities - to get representation, without having to live in neighbouring streets.
Going beyond people, should corporations or charities, or organisations that represent special interest groups get representation within government, rather than being lobbyists? If they have any influence over politicians, then surely they should be a direct part of the policy-making process & skip the wasted effort of creating media attention to get their cause at the forefront of the elected officials’ minds.
I’m not talking about having all corporations or groups represented ‘democratically’, but bodies could be formed that allow for a different type of representation that has a different type of input into the parliamentary system, in the way that representatives of the people are supposed to bring the concerns of their constituents, & likewise the representatives of the states in the upper house.
In a two-party system where both of these houses are dominated by party politics, it would have to be more effective to remove the one-upmanship & introduce issues as the primary focus of the agenda.
Are we mature enough to evolve our parliamentary system with the changing needs & advances in technology that allow us to be finer grained about the demographics within the electorate? Could we not add more dimensions to the governmental body that provides for people’s interests (vested interests, not pastimes)?
We don’t need multiple houses, but we need collegiate discussion of policies for the good of the country - all of it. This can only occur when the elected do represent the people (however they arrive there). Otherwise, we get stuck in the past, fighting the same irrelevant idealisms from centuries long gone.
Left & right? It’s time we left it all behind & got on the right track for a truly democratic future.
Political Commitment
Political commitment means actually putting your neck on the line for what you believe in, rather than dithering about wondering if the electorate (or media) will crucify you for your convictions (there’s an interesting mixed metaphor). There’s no point in trying to save your bacon. This is why people don’t trust politicians - the ones we have now just won’t do this. This is also why such politicians who would do it don’t exist - because the political parties that select candidates don’t trust exactly the type of politician that the people would!
There are exceptions. We call them independents.
Every so often, such a politician will pull a stunt to draw attention to their cause, but without the backing of a large political party, the media is unlikely to get too interested *unless* there’s an interesting element to the story. If the media don’t cover the story, then people won’t hear about it. Let’s face it, an independent is not in any position to churn out reams of propaganda for the masses. They rely on their convictions to generate the stories & to show their electorate just how independent in thought & deed they are.
Strangely, if you try to build a party around an independent thinker, then you are destined for failure - eventually. One or two elections down the track, that maverick streak that people loved in one person becomes a party line that “the establishment” (where did they come from??) insists is followed. This, of course, both demeans the initiator, & pushes away the public following that loved the maverickness. In Australia, in particular, this has happened in several instances - Democrats & One Nation.
True independence of political party politics means being able to commit to what you, the person elected, believe in. It would be great if this could also mean having the support of an infrastructure that generated interest in your views, helped to disseminate the truth (as opposed to the hype), & took away some of the mundaneity of the electoral process, but that would require a kind of altruism usually lacking in those involved in politics across most countries.
As long as politics equates to “power” in one way or another, then some will be political for the sake of being in or near that power. Let’s face it, otherwise there’s nothing “in it” for them. They want their team to win - which is an interesting way of thinking about it in itself, that there is only one winner, & therefore several losers. I don’t want to hark back to the “up to half of the voters don’t like their representative” discussion, but if there were more winners, then there would have to be fewer losers (representatives & voters).
*I just thought of a great way to elect a government, but I’ll have to wait until later, otherwise I’ll muddy this blog entry.*
I want my government representative to commit - to both make promises that are followed through, & show a history of doing so. I want a politician who has no fear, but has some consistency that I can trust over the period of entrenchment. If they showed that mettle, then they’re more likely to be re-elected than someone of whom I know nothing more than that they pay their dues to a particular political party that had once espoused views so diametrically opposed to another political party that they had to be at constant war with them.
I don’t want a battlefield for government. That makes no sense. There’s no war to win. There is no being on the side of right (or might).
I want debate. I want policies. I want decisions. I want progress. But most of all, I want my representative to show some political commitment.
In exchange, I promise to commit my vote to them next time.
On Your Bike
There are many instances of good governments - despotisms, monarchies - without democracy, & plenty of examples of bad democracy. Any scientific examination would show that there is no statistical correlation between the two. Any attempt at a double blind trial would be interesting, to say the least.
Most people want democracy because they think it empowers them - that is, they feel as if they have input into the system, & that they are clever enough to contribute. Note that their contribution is not to actually do anything within the running of the government, but to bitch & moan about what the government isn’t doing for them. “Ask not what your country can do for you …”
Middle class voters in a democracy think that they know better than those they elect to represent them. I make this distinction because anywhere that has a distinct upper class knows that the government doesn’t affect them, & the lower classes never get any real social change out of a government.
The vocal middle class, which forms the majority in western democracies (by definition, I suspect), expects that the government does its bidding, but in doing so, they put their direct faith in political parties making a land grab for the middle ground to represent them. You can’t be too conservative (seen to bolster the entrenched money & big business at the expense of the average punter), nor too radical (introduce social change that diminishes the differentiation from the disenfranchised).
It would seem that politics has no place for representing those two groups, no matter how small, & they are therefore unlikely to play an overt or direct part in government. A classic example in Australia is a quite competent self-made millionaire who was constantly referred to as silver-tailed, so had to step down from party leadership, no matter how clever & energetic he was. At the other end of the scale, how likely is it that a homeless or disabled person will be elected? Even a long-term unemployed person?
Very occasionally, someone under thirty is elected, almost by accident.
Strangely, when we create plebiscites, we ‘ensure’ representation from a broad range of demographics. For something serious, like policy-making within the government, that is not deemed to be appropriate.
One of the problems with a democracy is how the electorate is structured. If we assume a homogeneous voting population, there is no issue - all voters would be like all candidates, & there would be some ‘fairness’ such that representation would be based on an individual’s ability to sell themselves & their policies. However, the population is not homogeneous (thankfully), & voters are a large cross-section of the population. Even in Australia, with compulsory voting, those who aren’t citizens (regardless of length of residency) or old enough are not represented.
Within any given electorate, a minority is likely to be represented by their candidature. Minorities include non-European, under twenty-five, over sixty, unemployed. In some cases, each ‘minority’ might be significant, but they won’t band together - the young & old won’t join forces to topple the middle ground. If this happens in each electorate, then the total parliament is always likely to be a collection of the least offensive candidates across the electorates.
In fact, it’s almost as if the major parties, whose sole purpose is to fill the parliament with their trustworthy members, seek out ‘sound’ non-confrontational voter-friendly puppets to dance in front of the electorate & smile sweetly for the (local) media to garner votes from the new disenfranchised - those without proper representation. The race is on to get a second preference, for example (in a preferential voting system).
It doesn’t matter if your primary vote is ‘wasted’ on a candidate who stands for something that can be believed in. If only ten percent of the electorate agree with you, your vote goes on to your second choice - someone who you didn’t believe in as much. Your principles have been compromised. As counting continues & candidates are discarded, it may come down to the order in which your last two choices were made - or else your vote has expired & been disregarded - your opinion is no longer represented.
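For the programmatically inclined, here’s a minimal sketch of that preference counting - the ballots & candidate names are entirely hypothetical, & real electoral rules have far more wrinkles (ties, formality checks, quotas) than this little toy admits.

```python
# A minimal sketch of preferential (instant-runoff) counting. The ballots &
# candidate names below are hypothetical; real counting rules add ties,
# formality checks & quotas that this toy ignores.
from collections import Counter

def instant_runoff(ballots):
    remaining = {candidate for ballot in ballots for candidate in ballot}
    while True:
        tally = Counter()
        live_ballots = 0
        for ballot in ballots:
            # Each ballot counts for its highest-ranked candidate still standing;
            # ballots with no remaining choices are "exhausted" & simply ignored.
            for choice in ballot:
                if choice in remaining:
                    tally[choice] += 1
                    live_ballots += 1
                    break
        leader, votes = tally.most_common(1)[0]
        if votes * 2 > live_ballots:          # an absolute majority of live ballots
            return leader
        # No majority yet: discard the weakest candidate & redistribute preferences.
        remaining.discard(min(tally, key=tally.get))

print(instant_runoff([["A", "C"], ["B"], ["C", "B"], ["B", "A"], ["C", "B"]]))  # -> C
```

The line to notice is the last one in the loop - once your earlier choices are discarded, your ballot quietly flows on to someone you believed in less, or stops counting altogether.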
This is an implementation of democracy. I don’t feel empowered. I don’t feel represented. I don’t feel as though it can guarantee me a good government. I see no evidence of it.
A good government is one that makes good decisions for the good of the country - the people within it, not necessarily the people it represents.
If the two functions of government in a western democracy are to create policy & oversee its implementation, then neither of these is going to be done well if the policy makers are chosen from a select group of middle ground people.
There has to be a way to ensure representation of the electorate, but not necessarily within electorates. By that, I mean that there is something fundamentally wrong with forming a government from a majority within one group of elected officials, where each is the least worst representative of a geographically-defined cross-section of the otherwise heterogeneous population, having been selected by organisations who represent little more than themselves.
As tortuous as that paragraph was, it’s probably the only answer to the question I ask nearly any time I see a politician on the television - “How on earth did that idiot ever get into parliament?”
Politics & Government
That ideal being far from likely, we have governments consisting of the few to make decisions that affect the many.
There are many different types of government, but fundamentally they perform the same task - put people who are supposed to be more clever than the most stupid person in a position to make decisions in their (the stupid people’s) place. This usually takes the form of producing policies to be implemented by something the equivalent of the public service.
In a democratic system, this body of government is elected in some manner to represent some qualified element of the population (by birth-place, age, gender, nationality, etc). Under these circumstances (people’s representatives) it often has the role of overseeing the public service (rarely controlling it). In a more autocratic system (monarchy, despotism), the government may be a part of a hierarchy that manages the implementation of policies.
My interest is in elected representation - what is euphemistically referred to as democracy. Because it is ‘expected’ that an elected official represents their constituency, there are all sorts of pitfalls in the system. For any voting system, no representative actually has the full support of those who elected them. By definition, unless they were the only candidate & anyone could qualify to stand, then *someone* wanted someone else in the role.
Even those who voted for the elected official don’t necessarily support them - they could be the best of a bad bunch, the devil they know, a party hack, a donkey vote, blind luck, a mistake, or any of a number of other unfortunate side-effects of the system. In Australia, for example, where voting is compulsory, *someone* has to get the vote - or else you vote informal (invalid) - in which case, your opinion is ignored.
This is the root of the problem with democracy in general. It produces people who are by definition disliked by some percentage of the population that they represent. It also makes that person elected dependent upon the favour of voters to retain their position if they want to remain within the government. This rather ties their hands in terms of being effective. Nobody elected agrees whole-heartedly with the people who elect them - because of misinformation, mistaken identity, party affiliation, propaganda, the quality of the opposition, etc.
Now I’ve brought politics into the fold. Political parties like to hold power. I’m not sure what they do with it, but they are greedy & like to hold it - or at least ensure that no other party holds it. That’s what they do. It’s their sole purpose. People join political parties to be a part of that success, to feel the warmth of the power in the party, to share in the reflected light. A party makes them larger than they are themselves.
It doesn’t matter what the policies of that party are, the fact of the matter is that the party itself has self-preservation built into it to ensure that it gets power - re-election of its representatives - regardless of its idealisms or the people that it is supposed to represent on the ground level.
On this basis, a party will never be too controversial. This is a given. This means it will never intentionally introduce policies that - no matter how good they are for the population, society, country, world - will upset the electorate. But isn’t that exactly why we need elected officials - to make objectively the decisions that we wouldn’t make subjectively? Don’t we abrogate our responsibility by putting our trust in the democratic representative?
I have always said that we get the politicians that we can afford - that is, if smart people can get a better livelihood & satisfaction out of not being in politics, then only extreme altruism could draw them to it.
Similarly, political parties can only allow those who toe the party line into the upper echelons of representativeness. Any official who is a danger to the party’s electability cannot be allowed to voice their concerns or make public statements that might be embarrassing to the party, no matter how well-intentioned.
The system does not work.
I have no choice in the collection of people from whom I have to choose a representative - unless I stand for election myself. If I am lucky enough that the person for whom I vote is popular with my neighbours, & I agree whole-heartedly with them & the party that they represent, then I may otherwise be happy, or blissfully ignorant of what I lack - true representation.
What’s the alternative - allow everyone direct input on every governmental decision? Worse than anarchy. That gives people all the more reason to bitch & moan about their opinion not being carried through to policy - or else policy-making stagnates to general popularism.
A better approach would be for people who want to be representatives to be trained as such - to have to pass some sort of civics course that includes an understanding of the workings of government & an appreciation of how the system works. In my heart, I think that all voters should need a qualification, but then the uneducated would have no representation.
From those who are qualified, all should stand for election. The election should then be random, across reasonable lines of geographic, demographic, ethnic, gender & age spread. The terms of reasonableness would have to change over time from their current randomness, which we shall call a baseline of incorrectness, towards some form of ideal (more young people, more women, whatever) until a happy balance is achieved. These random people are given a fixed term.
Those willing & who qualify during that term as being most active & effective are thrown into a pool from which returning representatives are randomly chosen (again, the number can slowly change from whatever the average is now). The rest are still qualified for general election, if they so wish to stand after their experience.
The only politics left are around issues, rather than around re-election. That is something I could stomach as healthy debate, & it keeps the politics out of the government.
Unfortunately, the only way I can get these changes in place would be to sneakily be elected under the wing of a political party & work against them, or else become so popular a solitary figure that people listen. I could also campaign from outside of the elective system, but that means not practising what I preach.
Let me think more on the subject …
Quality as a State of Mind
One of the things that always disturbs me about introducing quality processes to a business - no matter what it ‘produces’ - is how the process itself is valued. This is just adding a wheel inside the wheel - nobody ‘gets’ quality; nobody understands ‘value’.
In itself, quality is somewhat nebulous - no-one can define what they mean when they throw around phrases like TQM (total quality management) or standards, in the general sense, or DQ (data quality) or best practice in a specific context. The understanding is always relative to the background & knowledge of the person making statements. Someone who’s been involved in quality issues in one arena is unlikely to know the specific problems in another, but will at least have a grasp of the ‘importance’ of having standards, processes, reviews, etc.
But knowing the importance is not enough - you have to value the idea of introducing Quality as a concept. This is why executive sponsorship is probably the most important aspect of introducing quality to an organisation that has so far been naive enough to believe that it is somehow immune to the perils of the tight-rope, yet continues to work without a net. That is, a start-up.
There are only two ways of getting executive sponsorship (& keeping it) - buy it through budget re-allocations that rob other business units, making it seem to cost the executive (or board) nothing. Or else sell the importance as a whole-of-business strategic step that needs its own board-level support in the next round of budgets, in some way that puts Quality on the same level as (or else a part of) Corporate Governance. In either case, if the executive doesn’t ‘get it’ - doesn’t value what can & must be achieved through a quality regime, then you might as well then blow that budget on making little signs that say “You don’t have to be mad to work here, but being angry helps”.
Executive sponsorship equates to management directives or buy-in. It also means independence from business unit in-fighting (especially over budgets). Strategic positioning within an organisation ‘guarantees’ precedence over tactical operations, in general. This all comes down to executives (CEO, fundamentally) valuing the idea of quality high enough to get it out in front of everyone’s eyes where KPIs can be written in, measurements concerning the quality process (not the business process that Quality is measuring) are understood & acted on, & the business itself can be turned around from its course heading into the whirlpool of self-destruction that quality-free blind navigation inevitably leads to.
Too dramatic? Perhaps. Does the metaphor hold? Probably near enough.
Having seen a few failed start-ups, one thing that is consistent is that early development usually cuts corners due to budget constraints - time or resources. Once ‘the money’ comes through, there is a lot of frustration that all of the success & progress from the early days is getting ‘mired down’ by the ‘extra weight’ of a public company or else private investors’ interests - that is, corporate governance. This is usually dealt with by cutting more corners elsewhere - which is easy when you’ve been used to doing that to create the product or service.
This superficial semblance of steady velocity does not take into consideration the changing environment - more obstacles in the path, & the path itself becoming more treacherous. There are two ways to deal with those changing circumstances - continue as you always have & metaphorically close your eyes, or else slow down & react to the changing surrounds.
When you close your eyes, you ignore what’s happening around you, miss opportunities, hit snags, occasionally crash & burn.
When you slow down, then you can work out how to deal with the new environment. That environment requires that you treat it with respect - business partners, service providers, customers - stakeholders. These are new entries into the start-up’s environment. The common thread for dealing with stakeholders is having quality processes in place that ensure that those relationships - imperative to the success of the business - become & remain solid, effective, & long-term.
If you value your stakeholders, then you must value quality processes (including communication with your stakeholders) in your business. If you value your business, then you must value the things that keep the business alive. Many start-ups, in particular, think that their business is based solely on one idea, one product or service.
It never is. It is always dependent solely on the relationships the company has. Many start-ups, with what has amounted to a break-through product, have gone under with the belief that someone has done them wrong - if only an investor hadn’t pulled out or a customer lost faith. The reality is that they have not taken care of the relationships that define the business. They have forgotten to change the thinking of the start-up to incorporate those relationships. They drive with their eyes closed.
It might be very Zen to attempt to run a business as if the rest of the world was a figment of your warped imagination. It would be more effective to become aware of an organisation’s operational needs, & stakeholders’ expectations of the business, & react accordingly. It requires a change in the value system - away from the product or service & towards the business. It requires a change in perspective from the delivery of widgets to the delivery on expectations.
This is achieved through an approach that values quality in all aspects of decision-making.
Where do you see yourself ...?
That’s one aspect of where I see myself. I see myself doing what I always do - whatever it takes to get the job done, with whomever I happen to be working with at the time, with whatever resources seem to be available, in whatever time has been allowed. If the team disintegrates in a flurry of personal issues, budgets get slashed, or some inappropriate schmuck ends up managing a project, it doesn’t affect what I do or how I do it. Fundamentally, my job has always been to keep the faecal matter off the rotating blades. I don’t see that ever changing.
Still, the question is asked. In a way, there has to be more to the asking than the fact that the question is always asked, has to be asked (because HR said so), & must be ‘interpreted’ thereafter. The person asking has to have an agenda, otherwise the asking doesn’t make sense. In many cases, I’ve got more experience than the one asking, so I could come across as a threat if I say “I’ll have your job”. Anyone too young is considered an arrogant punk for giving the same answer.
This question is supposed to be a way to discover someone’s career goals. The right answer means that a candidate is driven. Driven to what? Driven to undermine their team lead? Driven to look for work elsewhere when they get frustrated due to a lack of opportunity, not being noticed, or simply not being half as clever as they think they are? It doesn’t mean driven to make a project succeed. It doesn’t mean driven to be a team player.
An alternative to the “your job” response is to say something so naff, so completely wet, that you’re not taken seriously, like “I’d like to see this company grow to twice its size & value”. How? How will you contribute to that? Do you have any control? Input? What happens if it doesn’t happen? Will you leave in a huff because no-one listens to your brilliant ideas, or else management isn’t clever enough to implement them?
This all presumes that the interviewer gives a rat’s about the answer. If they’re just ticking a box that says “Question asked”, & another one that says “Didn’t respond with ‘on parole’”, then the job’s done. No animals were harmed in the making of the interview, but no progress was made in determining if the candidate is appropriate for the role.
In a way, one of the best answers is “In five years, I want to have beaten HR into submission so that they don’t force interviewers to ask such stupid questions.”
I’ve started wondering if I have career goals. I don’t think I do, anymore. I’ve done most of what I could nominally have been expected to do. If I do run my own company, that would be nice. I’d be happy enough to run someone else’s again if given the chance. But the doing it or not does not change me. I do not feel incomplete because I’m not doing it. I am not driven to do such things ‘at all costs’. I know people who’ve been like that. Good luck to them. I’ve often found such people impossible to work with, though.
People driven towards a goal are always looking at what’s coming up next, & how it can be used to progress them on their path. What about what’s happening now? If someone has their eyes on a point five years down the track, what about six months? What happens when they stumble? When the environment changes?
I know it sounds a little bit Zen, but I prefer to live in the moment. I like to be dealing with what’s happening now, & simply planning for eventualities in the near future, so that I can react with speed (agilely) when things change. It’s no good wandering about with only one plan for one eventuality. You need to have lots of plans ready for a wide variety of possibilities. Plans that could be developed further if they become more relevant - if circumstances tend to favour some plans over others. You need to be able to see the changing environment & react accordingly.
This is the basis of an agile methodology - whether you’re talking about IT projects or management decisions.
In five years’ time, I see myself not typing a blog like this - there will be a shift in technology that might allow me to speak clearly & record the words while transcribing. There might be a different technology for reading my brain waves & translating those into emotions & ideas. I don’t know what the future might bring, but I plan on being ready for it, rather than expecting it to fit into my plans.
It's Official
Wouldn’t it be Munchausenly wonderful to have pretended to be a pilot, a doctor, a lawyer - & all before the age of twenty-one! Regardless of your ability to pass off dodgy cheques, living the high life of a successful career without having to do anything to achieve it is just a dream come true. It’s an incredible story, even in its summary on wiki, which starts by saying that Pan Am estimates a fraud of 1,000,000 miles, 250 flights over 26 countries, over a period of less than three years … but then goes on to say that, in 1978, phone calls to his victim institutions revealed no evidence of any employment under the aliases given.
Curiouser & curiouser.
So, the book about the con was possibly a con. Now we have to determine if Frank Abagnale (the perpetrator) or Stan Redding (the author) was the con man, or if the institutions (banks, hospitals, colleges, the State of Louisiana) were “avoiding embarrassment” by denying all knowledge - that is, conning the one investigating the claims.
This just makes the whole mess more tangled & almost ludicrously funny, such that the movie could have been done by the Marx Brothers. You can imagine the slightly circular argument around the point that, if Abagnale didn’t do what the book claims, then some of the shine of his expertise (a very lucrative consulting business in the area of bank fraud) must rub off. Of course, this would bring into question how much he could be trusted - whether he is a con man or not!
In saying all of this, my point of concern with wiki was that it had no reference for who had made the inquiries in 1978, & whether anyone had interviewed the supposed co-workers of Abagnale at the institutions enumerated - possibly with pictures of the man. This would not be too onerous a task for someone who showed any real interest. Too often ‘official’ documentation reflects what is essential for upholding the institution, not anything close to the truth - & which institution would reveal the truth over the phone, anyway? People’s memories rarely suffer from such problems. Better yet - track down those people now, because they’re closer to retirement age, & they will desperately want to tell you the story of the disappearing Chief Resident.
It was when I was describing this to my partner that she pointed out that I only had myself to blame for basing my speculations on what I could find on wiki.
Some great questioner of half-truths I turned out to be. I think I’ve just been conned - or have been conning myself. I have been selective in my investigative activities, & am no better than those I complain about. It’s better I deny all knowledge of ever watching the movie.
It’s official.
No Wuckas
The hardest context of all to assume is ‘general background’, because it’s easy (or convenient) to believe that you & the audience have the same one. All of the little quirks that come together to give you your concepts for a shared life experience come to nought when you are faced with an ‘alien’ who speaks the same language as you, understands you perfectly, but has no idea what you’re talking about. You don’t discover this until you tell a particularly clever joke & it’s met by a room full of blank stares.
There are, of course, very different types of ‘alien’ in the room. There are the ones who work in a different field to you - which is why accountants just cannot talk to software developers. There are people from different countries who realise that the slightest hint of an accent will mean that the other person didn’t watch the same TV programs, do the same classes in high school, or even play the same sports. There are even demographic aliens, where lifelong near neighbours (or even family members) have nothing in common because of age, or choices in education, or personal tastes.
There is often no way to communicate directly with aliens. You have to find the common ground - a context so broad that it encompasses what you want to say & the backgrounds of the participants. You cannot be too clever, because clever things are almost always contextual - they won’t “get the joke”, or else it will take so long to explain that any vestige of humour will be lost after a few new concepts are explained.
Thus, finally, we come around to the title of this blog. Every Australian over the age of, say, thirty (at least), will know what this means. Some may even remember the origins of the saying (or have some vague notion). Here’s the explanation (& I’m not getting this from wiki, so excuse my memory).
Once upon a time in the great southern land, the imperial leader was a well-respected popularist by the name of Robert James Lee Hawke - “Bob” to all of his mates (which was approximately the whole population). Although a teetotaller, he understood celebration, & when his beloved team wrenched the America’s Cup from the Zeppoes, he claimed that any boss who sacked a worker for not turning up was a bum. He was that sort of loud-mouth. Popular at the time was a T-shirt with a half-crazed Bob (looking appropriately somewhat like a galah) shouting “No wuckin’ furries”, which spoonerism managed to get it past censorship & what goes for common decency in this country. Five syllables being two long, this was shortened to “No wuckas” when it became a standard element of Strine within a very short while.
Just as an aside, the direct translation “No worries” still needs context. The alternative (more common across the Pacific) “No problem” has its own context within Australia - or relative to Strine, at least. I’ve heard that the tone used is the most important part of interpreting an Australian’s response, as well as the actual words used. Australians are much more likely to tell you what something isn’t, rather than what it is, so “no problem” - or “no probs”, “no wuckas”, etc, fit in; but sometimes they actually mean the opposite. It is now common practice for the phrase “not a problem” to actually mean that there is one, indicated by the amount of extra effort (more syllables) in giving such a response.
So, if you ask an Australian to do something for you, & his response ‘seems’ to indicate that they’d be happy to, just double check by counting the syllables. If they could have gotten away with answering in fewer syllables, then they’re probably not happy doing it. Strine is about contractions - giving the flies minimal opportunity to enter the mouth.
Let me give you a few examples to illustrate …
Example 1.
Me to friend: “Could you write a blog for me, on the importance of assumed context when communicating with your audience?”
Friend: “Not a problem.”
Example 2.
Me to friend: “Ya got time f’r a frostie?”
Friend: “No wuckas.”
The Medium & the Message
Let me give you an example. SMS is a great way to send a message to someone when you don’t want (or need) to actually talk to them. However - & this is a big problem - the implementation restricts the length of the message, so a new language has been created to use this technology. That language is a barrier to communication unless you know it. This is great amongst a small circle of frequent messagers, but not so good if you’re trying to convey complex instructions. The medium gets in the way of the message.
Speaking of short messages, Twitter (& microblogging in general) is supposed to be good for just telling people what you’re up to. You can blast out gratuitous updates to your homeys (or peeps, or posse, or whatever is a trendy way of describing people interested in what you have to say) ad infinitum - as long as you restrict the message length.
But tweets are more than just the update, they are a social sharing of knowledge because they broadcast to a wider audience. They are the opportunity to share wisdom, rather than communicate one-to-one. Because of this, the usefulness of that sharing has made us invent another language - tagging of the message. For those not into Twitter, this means that your message can ‘adopt’ a relevance to various pre-defined topics, so that people who are interested in those things know that’s what you’re talking about. They can search, they can browse, they can acquire knowledge more easily - as long as the tags make some sense, are consistent, etc.
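To make that concrete, here’s a toy sketch of the searching side - the tag syntax & the sample messages are my own assumptions for illustration, not a description of Twitter’s actual machinery.

```python
# A toy sketch of tagging: pull '#word' tags out of short messages & build an
# index so that anyone interested in a topic can find every message marked with it.
import re
from collections import defaultdict

TAG = re.compile(r"#(\w+)")

def index_by_tag(messages):
    index = defaultdict(list)
    for message in messages:
        for tag in TAG.findall(message):
            # Consistent tags matter, so normalise the case before indexing.
            index[tag.lower()].append(message)
    return index

tweets = [
    "Back on two wheels at last #cycling",
    "Counting syllables in Strine #language #nowuckas",
    "Tags only help if they make sense #language",
]
print(index_by_tag(tweets)["language"])  # both messages marked #language
```

The index is only as useful as the tags themselves - which is exactly the “make some sense, are consistent” proviso above.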
This is a communication nicety generally lacking in English, for example. In fact, English is one of those rare languages where a double meaning is used in humour to specifically jar the listener into a different frame of reference, using puns & misdirection. If you had to tag everything you said, then the punch-line would be given away ahead of time.
SMS & microblogs also have that communication advantage of being transactional & text - you get the whole message in one block, & you can see (& understand) it all together. There is no waiting for the sentence to finish, or for the next sentence to start. You have it all in front of you, & a proficient reader will ingest the message without breaking it down into words. It’s almost like the sound bite that TV & radio so long for out of any important event - a very succinct message that captures the spirit of the event being reported, if not the truth.
For those who tend to turn to the back of the book to work out if the ending makes the whole book worth reading, a tweet can be quickly identified with the tags, no matter where they are in the message. You might miss something important, but the sender should have known to mark up their message with the tags in the first place!
As we make our communication more knowledge-aware - by using techniques like tagging, & other mark-ups that have been available since the web became a popular publication method (somewhat like this blog!), we are changing the way we use the language, & what we expect out of it. Let’s ignore for the time being the requirement for a richer character set to describe things ‘outside’ the language (like a ‘#’ for a tag, or a smiley face for a sentiment).
Our language is slowly changing with the medium - as always. Movable type made people more able to read. Phosphorescent type may make people more able to think.
Wise Up
Everything in between is just a matter of communication - you need to be able to pay attention to take in the wisdom of others, & you need to have presentation skills in order to convey a message. However, we are not born with these abilities. In these times of decreasing attention spans & reliance on technology over social interaction, you could almost say that we are even losing the basic skills for transferring wisdom.
This is how wisdom is quite distinct from knowledge, which is becoming increasingly easy to access - to the point where we no longer need to carry it around with us. We no longer need the facts in our head, but we need to know the method by which facts can be acquired - the index, the map, the web address. This is partly due to the sheer number of facts that we are expected to have, & partly to do with the accessibility of information in general (& connectability of people).
If we’ve got that much more information to play with, how do we learn how to use it? How do we learn how to get access to it (rather than being given an index)? How do we learn to discover the sources of facts in the way that we used to look at the natural world & discover facts?
Although this isn’t wisdom, as such, it is the ability to think for oneself, & this is a precursor to holding wisdom. It’s not enough, in itself, but it’s a start.
In today’s instant-gratification world, information is a commodity that can be acquired with little patience - it’s like fast food - an expected convenience. However, fast food is not particularly nutritious (useful to the body) & is often unhealthy (detrimental to the body). Many people shun fast food because they understand this. Many people “live” off it because they don’t. Information can be found by going to pick-your-favourite-search-engine & pressing a button. You are now informed. You are not well-informed; you could even be badly informed. Thank you, come again.
Wisdom is what gives you the understanding that a search engine is a fast food service.
In the same way, you don’t expect commercial television to teach you anything so earth-shatteringly useful that you will become successful & live a long, happy, & prosperous life (even if that’s the claim for the 6.30pm time slot each night). Why would you expect a search engine to slake your thirst for knowledge at the press of a button, just because it claims it has access to billions of results that may interest you?
If you read each of the billions of results, then that process will make you wise. Wisdom is that journey, or having taken such a journey, the ability to look at search results & determine if there is useful information on offer - like looking at the nutritional information on the packaging of fast food before eating it (trust me, it’s there).
Wisdom is not a drive-through experience. Wisdom is not the domain of couch potatoes.
Wisdom is not acquired through getting a University degree. That’s simply the equivalent of dining-in for fast food, or else watching the whole series of a sit-com, including ad breaks. It may sound like a big commitment, but that piece of paper at the end does not certify you as wise. I attended a very probing lecture by the Vice-Chancellor of Macquarie University last night, & the up-shot was that he believes that universities are there to educate, but students are there to learn. If the contract is broken by either party, then no-one is the wiser.
Once upon a time, very few people sought wisdom, & even fewer were sought as sources of wisdom. Such people were revered. They had generally acquired wisdom in their pursuit of knowledge, because that, in itself, was an arduous task that required perseverance & dedication. Anyone who got knowledge probably got wisdom, & was willing to share both. Now, the vast majority of people want knowledge (information), & yet everyone avoids wisdom. Universities, which once gathered knowledge wisely, now dispense information for a price (I had to avoid saying ‘cheaply’).
It’s ‘too difficult’ to acquire wisdom. It’s ‘not practical’, ‘not useful’, ‘irrelevant’ … I could put the inverted commas of sarcasm around many such phrases. If you see wisdom as unimportant, then you have, by definition, proven yourself to be in need of it.
It’s time for all of us - & yes, I include myself - to wise up.
"Dumb it down for me"
People make assumptions in what they ask of you. When someone wants a simple answer, they genuinely expect that there is one. There must always be one. Mankind has been seeking one for all of its sentient existence. Whether you are religious or not, the simple answer seems to be either “There is a God”, or “There is no God”. There’s a lot of in-betweenness missing from those two poles of opinion. The reality is that there is no simple answer. & yet, given that we’ve been contemplating this particular question for so long & still don’t have a clue, how is it that anyone can expect you to come up with a simple answer to a brand new question like “What do you want to drink?” or “Why did the project fail?”
Yes, there is a large gap in the importance of the answer to those questions, but fundamentally, you probably don’t have an answer ready when the question is asked. I know that some people have a response that they regurgitate every time the question is asked, but that doesn’t mean it’s an answer to the question, nor does it mean it is well thought out. “Beer” & “lack of planning” deflect the questioner wonderfully, but they negate the possibility that just once you’d like to try a fluffy duck or review your unwieldy business processes.
If there is no simple answer to a given question, & someone expects a simple answer, then there is no effective way to answer them. A Zen Buddhist might answer with “mu”, in that the answer exceeds the parameters that the questioner provides.
If you give them a “dumbed down” version of the truth, then they will accept this simplification as the truth, rather than it being a representation of it. When someone says “There is no God” they generally don’t mean “It is physically impossible for a supreme being to have created what I perceive to be reality”, they actually mean “I don’t adhere to the existence of the god that you revere in your belief system” - & they’ve even taken the short-cut of assuming that they know what context you have - what you mean by “God”.
The whole idea of dumbing something down is, in itself, silly. It makes the person asking the question sound as though they can’t handle the truth or large concepts, & it makes the person answering switch contexts to try to match their own idea of what a simple answer might be, rather than explain what they know (no matter how complex).
Good communication is about putting your ideas to an audience. An audience that demands that you speak Greek is not receptive to your ideas in Italian; & yet, interpretive dance crosses language barriers wonderfully in conveying a meaning. That is, there is always a way to communicate effectively, but it doesn’t have to be through the mechanism that the audience demands or expects.
If the audience demands to be talked down to, or treated like an idiot, then it limits the communication that can occur. If an audience is open to doing some interpretation & having an interactive dialogue, then meaning can be transferred.
Good communicators find how to get their message to their audience. If the audience doesn’t understand, then it’s the communicator’s job to find the common ground & elaborate, or else find another mechanism to impart the message. Stopping the communicator from doing their job shows a disrespect for their ability as much as it does a disinterest in their message.
When an audience “requires” the message to be in a particular manner (a given language, or dumbing down), then they limit the nuances that they can understand. They detract from the content, the meaning, the ideas that need to be conveyed. If the audience takes that dumbed down message as the actual message, rather than a representation, then the new version of the message becomes truth.
This is much worse than someone interpreting communication from their own perspective. Not only have they twisted what they’ve heard, but they’ve twisted what was said!
Whether we’re talking about religion, a status report, or a conversation in a bar, dumbing down a message is … dumb.
Not Just
The word ‘just’ is actually two quite distinct words & meanings.
As an adjective (enhancing a noun), it comes from the same place as ‘justice’ - relating to truth, equality, being right - “a just decision”.
As an adverb (enhancing all sorts of things), it can mean ‘very close to’, in time, in precision, in fit to a circumstance - “just before noon”, “just what I wanted”, “just a little bit more”. If you like, it could be considered “almost, but not quite”. It does not mean “is the same as”.
Many people abuse the adverb by overlaying it with their expectations “it’s just a matter of …” - especially if they’re explaining something that they expect someone else to understand, when they themselves don’t quite grasp a concept. In a way, it’s a weasel word, which means it’s being used to mask something else inferred. In general, people who say “X is just like Y” actually don’t think X is anything like Y, but life would be much easier for all concerned if it was. It’s similar to “let’s pretend X is like Y, & see where the discussion leads - now assume X actually is Y & trust me”.
At the end of the day, X & Y aren’t any more the same for saying it.
Being a good communicator is all about saying what you really mean when you convey a message, not what you would like other people to think you believe. It’s easy enough to train yourself to recognise such weasel words as ‘just’, & then adjust your way of thinking. Change your words so that you can believe what you’re saying. The first step is to listen to yourself, & every time you hear yourself say “it’s just …”, then ask yourself “Hang on - is it?” Soon, you’ll be catching yourself well before you use ‘just’, & what you say will correlate with what you believe to be correct (just).
You could say that using ‘just’ badly is indeed unjust.
Pick Life
Nobody intentionally takes the difficult path - unless through some masochistic streak or in training for something unavoidable & even more difficult. It is not the case that everybody else is so stupid as to miss the easy opportunities. What people who use phrases like “cherry picking” & “low-hanging fruit” are really saying is that they are jealous of a perception that someone else is having an easier time than them, & that they want a slice of it. They want a piece of that easy action, because they’re tired of working hard to achieve things. It’s unfair that others don’t have to work as hard. Note, too, that these are the people who expect others to do the cherry picking.
In itself, the cherry metaphor is supposed to relate to how easy it is to pick cherries, on the basis that they have these nice long stems to grasp, well away from the fruit, & the fruit itself is firm enough to not crush easily. Then, of course, there’s the reward of getting to eat the cherries - & the fun of spitting out the stones. Going off cherry picking for a weekend on a friend’s farm sounds like an idyllic get-away. You just wouldn’t want to be a cherry picker for a living.
Similarly, the low-hanging fruit is in easy grasp of everyone who passes - they’ve all fondled it, the animals have nibbled at it, & everyone has rejected it, which is why it is still hanging. I’m not even sure if such fruit is necessarily ripe, if it hasn’t fallen from the tree. Perhaps that’s the intention of the metaphor - eat the unripe fruit, just because you are clever enough to look up, not down, where the ripe fruit is underfoot.
Life, unfortunately, is not like cherry-picking. Life, as George Bernard Shaw put it, wasn’t meant to be easy, but I leave it up to the diligent reader to find the rest of the line, which suggests that easiness should not be the goal in & of itself.
Life, however, is interesting, some might even say delightful, because of the challenges that come from realising that there is no low-hanging fruit worth taking, & that cherry picking is a nice pastime for a change, but isn’t a lifestyle choice.
I don’t want to pick cherries. I pick life.
Just Say Sorry
There’s a lot to be said for apologising when you do something wrong - owning up to something that is your fault - but too often people say they’re sorry when they don’t mean it at all: it’s a knee-jerk reaction to a situation, a meaningless phrase that gives you something to say.
When I was working in a small office, phone calls for the CEO were routed to my desk, at which point I would say “I’m sorry, but she’s not here right now”, & my co-worker would twitter with glee “Don’t be sorry - it’s better when she’s not here”. It goes without saying that her acquisition of the English language did not come with the same pat phrases that mine did.
If you walk down a corridor & turn a blind corner to find someone almost on top of you, the first thing that pops into the mind is “I’m sorry”. What exactly are you sorry for? Having to share the corridor? For the architecture of the building that makes blind corners? Or that the corridor is so narrow that you would have to hug one side to avoid hitting people? As apologies go, it’s quite lame. It’s worse if both parties are sorry, like joining a mutual apologist society. There’s something to be sorry for.
Even when you did not hear someone, or think you may have misheard, “Sorry” can be anything from “I’m sorry, I wasn’t paying attention” to “I’m sorry, but you seem to be mumbling into your lunch” or “I’m sorry, but did you really say that QPR had made it to premier league?”
In Australia, there was a lot of trouble over saying sorry to the native population who endured years of government intervention into their private lives - such as removing children from parents on the basis that they could be brought up better by white people. The idea of saying sorry divided the nation, with half feeling that the Aborigines deserved an apology & the other half thinking that they personally had nothing to do with the policies or practices of the past generation. There are similar movements happening with regard to the Japanese actions in World War II - no-one involved in the decision-making is still around, & also very few of those directly affected.
However, what does it cost to say “Sorry”? Do you become a lesser person? Only if it is simply a phrase that you insert when you’re too befuddled to think of anything apt for the occasion. That’s the nub. As knee-jerk reaction phrases go, “Sorry” is one of the most common, inane, & meaningless, in general.
If “sorry” is a learned response to situations, then it can be unlearned through practice - stop yourself, think about what’s going on, & whether you really mean “sorry” at that point. If you don’t, & the other person doesn’t expect it, then try not to say it, & you’ll feel much better.
Another of my pet peeves is “How are you?” - which is prevalent in Australia, but not unique. There’s nothing worse than someone who knows the ritual so well that they say “How are you fine thanks.” That’s when you know that they’re not listening to you & have no interest in the response.
It’s right up there with “Bless you!” when someone sneezes, which is an English practice born from the expectation that your soul is escaping, or else that you’re distracted enough for devils to sneak up your nose. If neither of these actually sounds likely, then saying “Bless you” is about as useful as hitting someone over the head with a sledgehammer & saying “Sorry”.
I seem to have gotten carried away there.
Sorry.
Communicating with Authority
Way back in the dark ages of this blog, I wrote about style, grammar, technique - the nuts & bolts of “good” writing. There are plenty of references out there - the authorities that are always given to support a particular opinion on any topic in writing or speaking. Right there, I have blown my cover - opinion. I have an opinion, each of these authorities has an opinion. Everyone has an opinion on those authorities, & we all go around touting our particular brand of bigotry.
There is no “right” way to communicate. There is only an effective way, & that is dependent on you & your audience. It is totally subjective. There is no authority to step in & say that the “standard” way must work because everyone has been trained to speak or write in that way & everyone who reads or listens expects it. That simply is not the case. I write in English, as I was taught to do. I listen to people from the stand-point of someone who grew up listening to people speaking English & then was dumped in an environment with people who had not been so brought up (called real life).
Everyone makes their best efforts. If they don’t, then they seem to not want to communicate. People learn their own techniques for being more effective at getting their message across - whether it’s speaking more slowly, always nodding or interjecting with “yes” as they listen, watching the lips move, or running their finger along the line as they read. None of these things are “correct” or “standard”, because standards are made for people who are proficient, not for people who need to & want to continually improve. In fact, sometimes standards are made arbitrarily by people who think things should be a certain way with no regard to how they actually are - etiquette being a classic example.
No-one is so proficient in English that they can’t improve. Therefore, there is no standard, as there is no-one with enough authority to set it. There’s a big statement. We have been in the situation for quite some time now where the Royal Family - the Monarch in particular - has been quite proficient in English (don’t look at me like that, George I couldn’t speak it at all), but that doesn’t make them authorities. There is no “Queen’s English” - especially not in Scotland (where they speak Scots, generally considered near enough to English), & probably not in the US (where they have a thing against Queens).
The average person who was brought up in an English-speaking community gets quite frustrated with those who weren’t - & even more so when faced with someone from a distinct English-speaking community from a different part of the world or from a different ethnic origin (that is, has a different cultural heritage with distinct vocabulary). The standard idea of a red-neck is someone who thinks that the way that they were taught is the one true way, & everyone else is wrong - whether we’re talking about communication skills, religion, or how to crack an egg (with apologies to Dr Swift). This happens with other groups, but “native” English speakers stand out because of the sheer number of different “native”-nesses & “non-native” speakers.
As pointed out earlier, as English becomes a de facto world language, it becomes less like the English that gets taught. Only a red neck believes that his version should be the world standard. Let’s not even start on accents.
But I should try to make a point here … most books on improving communication skills seem to be about teaching techniques that make the communicator more comfortable & a master of the art, giving them a false sense of security in their own new abilities - on the basis that they are learning from an authority. What if people started doing the opposite?
What if everyone assumed that they were a poor communicator, & made every effort to ensure that their message was understood by their audience, making minimal assumptions about shared background or vocabulary, & actually struggled, intentionally, to labour each point? This would be more likely to guarantee the success of the communication, even if it guaranteed a certain level of annoyance (in both parties). But if everyone does it, & it becomes the standard technique, then everyone will expect it, & everyone will accept that that is the best way to communicate.
This breaks down barriers - not just in communication, but also in culture. If you assume that there are no barriers, then they won’t be broken down, they will be ignored, & you can happily shout over the wall at each other & say “What?” every time a gust of wind blows your words away. But if you chip away at the wall, you get holes through which you can gain a much better understanding of what the other person’s background & environment are, & you can better tailor your communication to suit them, & thus become a more effective communicator.
It is only by being humble that we can truly communicate with authority.
Killer
In IT, people talk about a “killer app” - a software product so wonderful that it will revolutionise the industry & make billionaires out of the people responsible for it - or at least the people who invest in it. Similarly, when making a speech, you might want to “slay” the audience or the opposition - apparently these are both good things - due to the excellence of the content or the delivery.
For most people, surviving the ordeal of public speaking, or even writing an internal memo or email, is enough. They don’t set out to conquer the world with their words. & yet, the aim of many “self-help” books is mastery - mastering your fears, your audience, your destiny, your compulsions, your inabilities. If it were so easy to help yourself achieve greatness, why haven’t you done it already?
How about simply being good enough? How about being able to communicate in such a way that you are satisfied that your message has been conveyed - which is as much about garnering the feedback of your audience as it is about preparation & presentation? It’s about closing the loop. It’s not about leaving the room as the sole survivor (having slain them all).
Communicating is a creative process, not a destructive one. We want to nurture the relationship we are building through conversation (written or spoken). An argument, with a winner, is not really a form of communication - at least not in the commonly-used modern sense. Strangely, the original (Greek) intention was certainly to be having a conversation - even a discussion - that may convey points of differing opinion to be used to better reach a decision. I am reasonably certain that the “loser” of the discussion was not ritually disembowelled (which sounds more Roman than Greek).
The result of a well-crafted communication should bring a warm & fuzzy feeling of a job well done, not the warmth of fresh blood on the hands.
I'll Assume You'll Find This Interesting
Sometimes when I’m sitting at the computer, I will play a game called Freecell - it may come as a shock to some of you that I play games, but to others it will merely confirm your suspicions.
The game itself is not particularly taxing - it involves moving cards around in stacks until you can remove them to other stacks - but it has a large number of possible permutations (that is, it isn’t boring), & has one redeeming feature - there is always a solution to the logic puzzle it presents. Given this fact, it must be possible for me to find that solution. The general approach to the solution is to think “to remove that first card I want, I have to move this card, which means I have to move this card, …”
Some would call this a ‘logical’ approach to solving the problem. More often than not, it doesn’t work, & the tendency is to back-track on the logic to work out where I got the logic wrong. This is time consuming. It is often wasteful in terms of solving the problem. It is also the way that most of us were ‘trained’ - if we were trained at all - to approach such problems. We were taught to break down a difficult problem into simpler steps & solve each of these in the order we ‘know’ reaches the solution. However, can we ‘prove’ that this will solve the problem? It sounds ‘logical’.
For those who have been paying attention to the inverted commas, & understand one of their many uses, you will see that I’m not convinced ‘logic’ is the right term here, or at least not the right application of thinking. It’s not quite the right methodology, nor the appropriate approach … & it’s very difficult to describe an alternative that can still be considered correct in reaching what appears to be the solution to a logic puzzle.
However, de Bono coined the term “Lateral Thinking” quite a long time ago, & it still hasn’t filtered into our speech sufficiently to see it as a kind of logic beyond logic, an approach to solving logic problems that is different, not ‘logical’ in the sense that we were taught.
My approach to solving Freecell is to think in terms of having made an assumption that was wrong in my logic, & therefore the moves that I made based on that assumption would not lead to a solution. How can we get away from false assumptions? Lateral Thinking. How do I solve the logic puzzle when I seem ‘stuck’? I start from scratch & remove every assumption I’d made before.
This may sound like I’m diving into the problem blind, with no hope of solving the logic puzzle, but this works because logic puzzles are about patterns, not process. The brain will find the patterns (because that’s what brains are really good at), without needing to understand the process of solving the puzzle. When you’re playing a game on a computer, you don’t have to tell the computer how you solved the puzzle. You’re allowed to just solve it & move on to the next problem.
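For the programmers among you, here is a minimal sketch of the two mindsets in Python. It is not real Freecell, & the ‘moves’, ‘apply_move’ & ‘is_solved’ callbacks are hypothetical, puzzle-specific stand-ins - but it shows the difference between unwinding the last assumption & throwing them all away to start again.

import random

# Illustrative only: 'backtrack' is the 'logical' approach - commit to a chain
# of moves & unwind the most recent assumption whenever it hits a wall;
# 'start_from_scratch' is the 'lateral' approach - drop every assumption &
# re-search from the initial state with the candidate moves in a fresh order.

def backtrack(state, moves, apply_move, is_solved, depth=0, max_depth=20):
    """Depth-first search that back-tracks on the last assumption when stuck."""
    if is_solved(state):
        return []                      # nothing more to do
    if depth >= max_depth:             # the human equivalent of losing patience
        return None
    for move in moves(state):
        plan = backtrack(apply_move(state, move), moves, apply_move,
                         is_solved, depth + 1, max_depth)
        if plan is not None:
            return [move] + plan       # this assumption led somewhere
    return None                        # every assumption from here is a dead end

def start_from_scratch(initial, moves, apply_move, is_solved, attempts=50):
    """Forget the failed chain entirely & search again from the beginning."""
    def reshuffled(state):
        options = list(moves(state))
        random.shuffle(options)        # a different ordering of assumptions
        return options
    for _ in range(attempts):
        plan = backtrack(initial, reshuffled, apply_move, is_solved)
        if plan is not None:
            return plan
    return None

Neither function knows anything about how the puzzle “should” be solved - which is rather the point.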
You’re wondering by now where this is all leading…
Communication is a logic puzzle. When you want to express something to someone, you have a goal (getting your message across), & you have a process in mind (sentence constructions, ideas, language), & you apply a logical process to convey your message. What happens if it doesn’t work? What happens if your audience is still totally dumbfounded?
You have to accept at that point that you’ve made assumptions about your audience - whether it’s a commonality of background knowledge, language, your communication skills, or even whether they’ve got perfect hearing in that ear, you’ve made an assumption. You could take a logical approach & remove these assumptions one by one, or you could take a lateral thinking approach & scrap all assumptions & try a new way of communicating entirely.
The new way may be through using metaphors, speaking more slowly, or with different emphasis on key points, or even asking questions to elicit the audience’s level of understanding, so that you can remove assumptions. These are all a part of good communication. They solve the puzzle of getting your message across.
Of course, if you’ve got no feedback mechanism from the audience, then you are somewhat stymied. If you don’t know when you’ve solved the puzzle, then it’s particularly hard to know when to stop. In this way, we come full circle, & relate back to this particular blog & how, time & again, I wonder if I’m making an assumption that any of this is of any interest to anyone.
Sell & Tell
The main point of communicating (I told you I’d be back on the old topic) is to get a point across to an audience. If the point you’re trying to make is uninteresting, then you probably won’t have an audience, so there’s little point in making it, because you’re probably wasting your time as much as that of any passer-by who happens to be caught in the maelstrom of your communicative attempt. This applies equally to standing on a soap box & writing a blog.
If you are a deaf & blind man preaching from the pulpit, then you are probably doing so for your own benefit. Writing a blog is like that. I could see this act as cathartic, or else an attempt at linguistic self-gratification (& yes, I could have used another word there, but declined). Without an audience, I’m the only one getting anything out of it.
You can create an audience. Sometimes you can start without one & attract attention to your communication by making the content interesting or directing the content at people who might be interested. Don’t talk about social reform to the local conservative party hack; take it to a group of communists for further discussion - if that is what you want to talk about. In general, there is too much communication pollution, & we waste resources creating communication that we just don’t use, & the internet is becoming just a big landfill where truckloads of rubbish are piled on top of yesterday’s leftovers, with no-one having the time or energy to sort out the recyclable stuff from that which is best left to find its own half-life.
If you don’t want your words to be relegated to such a scrap-heap, then you need to find your audience. In fact, you need to sell not just the words themselves, but a desire to hear the words. You have to create the audience, sometimes, because it may not be there yet. No-one may give a toss about the plight of the spotted dik-dik except for a few very strange people who go on about God’s creatures yet quite happily go home to cook a piece of cow for dinner. What you need is a growing crowd of supporters to hang on your every word describing the declining fortunes of the species due to a combination of environmental changes, human intervention, hunting, poaching, & inconsiderate tourists.
You have to make people love to hear your words. You have to have them hanging there in suspense, waiting for an update on little Albert, the new-born dik-dik, & his struggles for stability on four legs & being left alone to fend for himself while his mother goes out foraging increasingly farther afield as the drought sets in.
Now you’ve got a story to tell. Now is the time to concentrate on what you’re saying - how you’ll keep that audience so carefully brought to your bosom, & nurture it, & understand its needs. The audience wants you, & you can’t let them down. You’ve sold them the idea of the story, & they’ve bought it, so now it’s time to deliver.
The big question is, do you drop the idea of telling the story if nobody cares? Plenty of people do. The majority of academic papers are produced on the basis that there’s a conference that needs to be submitted to, & an audience that can only be reached by selling an idea to someone who probably has no interest in your content. Such critics, or gate-keepers, supposedly represent the tastes of the audience that you want, in the same way that a restaurant critic may be barring you from a full house, or a movie critic cutting short your latest production’s run at the cinema. Sometimes you have to sell to these people first, & they will on-sell. They are communication middle-men. They don’t necessarily add value to your content, but they do provide distribution to the audience.
If you start thinking of them as the retailers to your wholesale communication product, then you’ll probably go batty trying to work out who pays the GST, so it’s best to stop going down that metaphor.
Critical success factors:
- have a story worth telling
- find an audience for the story
- thrill your audience
I do hope you enjoyed this installment, as I promise there will be many more & I can only get better at it. Past copies are still available in the recycle centre that is this particular blog, but I am well aware that yesterday’s news wraps today’s fish.
Inalienable
A much heavier topic than communication, I’d like to take an aside to talk about humanity, society, freedom, etc. This is based on my current readings over the last few days - Rev Kirby Hensley’s dreams of the separation of church & state, & a revisit to Ben Elton’s Stark. In some strange Jonathan-Creek-esque thought process, it all suddenly made sense.
At first, these two authors may appear unrelated - Hensley often talks of the inalienable rights of a person - security, food, procreation (in that order) - & how governments have never been particularly good at ensuring these for all of the people all of the time. As an American, he views their Constitution as somewhat excessive if it doesn’t have these rights exclusively. Once upon a time, when the state was not so good at running things, the one true church would fill the gap & ensure that responsibilities were met. With the current proliferation of schisms, there is no one true church, so each church’s attention has turned to competing for the right to claim that role. Hensley’s core religion is Humanist.
Similarly, government, which used to be a relatively stable platform of power, has turned into a competition for being in government. Isn’t democracy wonderful? If the majority of the energy of those elected is poured into staying elected, then they’re not really governing. With a weak church, who is pointing that out? Even the bicameral system that was intended to hold checks & balances works on the same principle of one-upmanship, to ensure that the talons of some mysterious force referred to as a political party firmly grip power.
Now, throw into this mix a few thoughts from someone who should be taken more seriously - Ben Elton suggested that those who are making the decisions concerning people’s everyday lives are the ones actually in power, & that the world would fall into chaos without them. He is referring to the producers - those who own the means of production & effectively dictate what is made available to the consumer. These people have power, yet have no responsibility. They don’t get elected - that would be silly. They have no fear of losing power through popular vote (in general). They are a stable force.
These people should form the core of the oversight government. They wouldn’t dare collude, because they are by definition competitive, & if this group is small relative to the body of candidates, then this oligarchic system will have its own checks. They have too much to lose to be corrupt during this sideshow of government. They have their livelihoods at stake.
The average politician is effectively unemployable. Some senior people go back into business, mostly as figureheads, but few work to the ideal of representing their fellow constituents in the parliamentary proceedings & staying grounded enough to do so over any length of time before getting to their use-by date & returning to their past lives. Some systems of government do work like that.
The British government often uses ‘outsiders’ - non-politicians - in the cabinet. The Swiss militia system, for both the army & the legislature, seems to work quite nicely for them. Countries as diverse as Rwanda & New Zealand have assured representation from women or native populations. These are just some examples whereby non-traditional, or non-professional politicians are given involvement in the process of government.
Fundamentally, of course, ‘government’, being no longer absolute, is simply the right to set policy & oversee those who implement it - that is, the public service. As such, the dual roles of thought leadership & implementation oversight don’t need to be in the same place. Having two houses within the government where one watches over the other watching over the public service seems somewhat pointless except as another indication that the system doesn’t work. In many countries, non-elected professional ombudsmen also oversee the public service (& other industries) as a proxy for the people.
This is why so many ‘ordinary’ people complain of the many layers & inefficiency of government. The reality is that the system is inherently complex because of its need to appear democratic & corruption-free. It needs to justify its own existence. But simplification takes away that veneer. The simpler the system, the easier it is to make it go wrong. The more complex, the harder it is to make it do anything.
There is no inalienable right to good government. It is the responsibility of the society to govern itself. A mature society does this democratically (not necessarily representationally). An immature one should be able to rely on something beyond state - such as the church or a non-absolute monarchy - to ensure some semblance of social responsibility within the government.
I only hope this diversion from my usual content doesn’t alienate me from my usual readership.
Talk to the Hand
Communication is about getting your message across & understanding the message or intent of the person with whom you are conversing. Simple enough. Although I tend to concentrate on the words & their meaning, this is only a small part of communication (as I keep stating), & the things that aren’t words are harder to express in words - which is why we use & need those other channels!
A classic example is hand waving. Some people wave their hands about more than others. I believe that women tend to be more expressive with their hands. Politicians are very careful about the message that they convey with their hand movements - they have to be powerful without being threatening (except in parliament). TV presenters & advertorialists are very conscious of their hands & making them seem ‘natural’ when the hands often have nothing to say. In this case, the copy-writers haven’t included the hands in the message to be conveyed, so it’s obviously quite hard to speak using only one of the available methods of communication. It’s the equivalent of assuming that the audience only have black & white TVs. To some extent, copy-writers may still think that they’re working for radio …
I digress.
The problem with talking with hands is that there is no shared language. There is no grammar, no dictionary, very few classes for learning the pronunciation - although people like Alan Pease do talk of body language in particular & teach the reading of it. We ‘assume’ that we understand other people, when we’re really applying our own cultural background to their actions. It’s somewhat like two dialects of a language where the nouns are not only different, but often interchanged (like Strine & American English).
Sometimes, a hand movement has meaning that is universal “It’s over there”, “It was THIS big” (note my resorting to capitals as emphasis to indicate tonal change). But at other times, hand movements are distinctly not universal. The European style of “come here” using the whole hand is quite rude to the Japanese (palm upwards), but the Eastern style (palm downwards) is very similar to the English “go away” or “shoo!”. One infamous American President once tried to use a “V for Victory” sign from his cavalcade in London & had it around the wrong way, which is quite distinctly “up yours”. Oops.
Of course, obscure hand motions & actions are useful when you want to insult someone without their realising it - that is, for the benefit of your friends, or else to let off steam. Biting the end of the thumb means very little to most people outside of southern Europe & Shakespearean England. This makes for private jokes, but is not all that useful for conveying the message to its intended recipient - you may as well write them a nasty note in Inuktitut (the language of some Inuit - or Eskimos, if you prefer).
Bad hand movements can be a distraction if you’re not careful - if your hands are moving radically around, yet are unrelated to your conversation, should the other person be watching your hands or the expression on your face, which is far more likely to be in tune with what you want to express? Don’t take their concentration away unless you are performing some form of linguistic prestidigitation, or else, “Hey Presto”, the point you were trying to make will disappear without a trace.
Next time when you’ve got someone backed up against a wall while you’re talking at them, & your fingers are making threatening actions towards them, spare a thought for how much index fingers look like knives, & whether your meaning is intended to cut them. Even sweeping hand strokes can brush people away.
Or if you’re someone who uses all the fingers & the palm of the hand to gently pat the person you’re talking to, could it be interpreted as an affront - an invasion of space - rather than the reaching out to the audience that you intend it to be?
You’ve been speaking this language for some time, & it is definitely time that you learned what it is that you’ve been saying.
Are We There Yet?
So much can be said about us with what we say & how we say it.
When a child says “Are we there yet?” - assuming that they do it in context & are not being ironic - they may have no interest in ‘there’. In fact, they may despise ‘there’, but ‘anywhere’ is better than ‘here’ - which is being stuck inside a mode of transport without sufficient distraction. The real question the child is trying to ask is “Can you guess how bored I am?” & they even give you a clue. They need attention or a distraction.
This says a lot about the child - if you’re the parent, you have a basis from which to form an action - amuse the child; if you’re not the parent, you can roll your eyes (but that rarely gives long-term satisfaction).
A parent will sympathise with the child. A non-parent may sympathise with the parent.
However, adults say it, too.
How many times (for those who work for or with other people) have you heard a boss - or even a partner - ask “Have you finished yet?” on a task for which you have responsibility & control, but where they have some vested interest in the outcome? Are they asking out of genuine concern for your welfare? Sometimes. Are they being critical, implying that if they had been doing the job it would already have been finished? Quite likely. Are they bored? Hmmm … maybe, or else under stress because someone is depending on them to deliver or complete the next task.
There are admittedly people who always believe that they could do any job better than anyone else - the kind who, as Ben Elton puts it, would stand behind Van Gogh just itching to grab the brush & finish off the picture. These people are a little bit arrogant, but mostly impatient. They are not bad people, but they are bad communicators. Van Gogh communicated with colour & imagery. Taking the brush off him is like sticking your hand over someone’s mouth & finishing their sentence for them.
Do we sympathise with such people? Or do we tell them to (politely) get a life or go bother someone else? What does this say about you?
Do you communicate with someone when there’s a shared task, or simply communicate at them?
The language that we use is a small part of how we communicate, as tone, physical actions (body language), speech patterns, attention, etc, all contribute to the message that we share with our interlocutor.
Communication skill acquisition is a learning process that we all have to go through. If we don’t get the skills, then we’ll spend our lives like children trapped in the back seat of the vehicle, not caring where we’re going, only knowing that it’s taking a long time to get there. The adults in the front seat, discussing navigation, leave us out of the conversation until a voice pipes up for attention.
“Are we there yet?”