Since Facebook introduced the data-harvesting ‘Continuously Upload Contacts’ feature in settings, a change has occurred in the background (in the Facebook API, for those inclined) which prevents you from downloading your friend list via a trusted third-party app.
The Facebook app itself also no longer supports the older-style ‘contact sync’ properly (or at all) on either Android or iOS.
On top of that (and YMMV), the calendar sync no longer seems to work either. There is a workaround you can follow (link below) to create a Google calendar which syncs your Facebook contacts’ birthdays – and this is the primary reason for my post.
I used to rely on the app syncing calendar events to my phone, so that I could see at a glance whose birthday it is and send them my best wishes, but I’ve missed a few recently and now I know why.
I’m starting to wonder what benefit the native Android/iOS app offers these days, versus good old mobile website access. I’m going to ditch the FB app on Android and start using ‘Tinfoil for Facebook’ instead, which looks and feels very similar but does away with the bloated spyware that the official app has become.
This post is not intended to start a flame/holy war or any other kind of religious conflict with regard to Linux desktop environments (DEs). What it is intended to do is simply catalogue the multitude of problems I have been encountering while using Debian Jessie and GNOME 3.14. 🙂
I LOVE GNOME (I truly do)
Let’s put this one right out there: the GNOME Shell/GNOME 3 UI is, IMHO, the BEST desktop user experience out there for Linux.
“Wait,” you might say, “doesn’t this conflict with the title of this blog post?”
Well yes, it does. But I want you, my learned reader, to understand that I wish that the GNOME DE was as stable and solid as it should be. As it could be. And hopefully as it will be.
You see, this is what Linux and other Unix-like operating systems have been known and reputed for – their stability. I love what the GNOME devs did when they decided to reimagine the desktop for GNOME 3: they used space sensibly, vertically, which to me feels more natural and intuitive. And I love how it’s meant to stay out of the way – another good design motif.
But in terms of stability, sadly, GNOME has been something of a disappointment to me, and I wish this were not the case. Perhaps this is just a consequence of its ambition, and that will always garner my respect. Or maybe my install went terribly wrong somewhere. But I don’t reckon so. So, without further ado…
DISCLAIMER: WRT the issues with Debian Jessie‘s implementation of GNOME Shell/GNOME 3, I shall simply refer to it as GNOME. I apologise to the purists out there. I am only commenting on my experience in Debian Jessie, not anyone else’s, nor of any other GNU/Linux distribution. Finally, I intentionally do not go into detail here and am not providing numerous distro/upstream links to “validate” my own claims. I don’t need to. If you’re interested, just search anything I have put below. I am pretty confident you will find stuff…
The 10 Problems
Have you had similar experiences to these? Do comment below.
1. Tracker

The problems with GNOME start from the very moment you log in: it’s a disk-thrashing sluggard of a desktop. And yes, I am using a spinning disk, not an SSD. Why? Because badly written software doesn’t deserve a place in my CPU, let alone being so resource-hungry as to require an SSD.
So yes, Tracker is the first problem with GNOME. From logging in, all the way through your session, to shutting down your machine, it’s there – consuming all available CPU, disk I/O and (perhaps due to a memory leak) system memory. Happily gobbling it all up like a sickly child with no manners. 🙂
Perhaps I am being unfair, implying that Tracker is “bad software”. It’s not a bad idea and its search seems to work well. But it doesn’t rein itself in. And software that doesn’t honour the choices users make in its own preferences panel is software that needs attention.
There are plenty of people and posts on the web reporting similar experiences. But why not just disable tracking completely, you ask? Like, through the GUI you mean..? Mmm.
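(If you’re wondering what “not through the GUI” looks like: on the Tracker 1.x that ships around GNOME 3.14, the indexer can be throttled or disabled via dconf. The fragment below is a sketch – the key names come from the org.freedesktop.Tracker.Miner.Files schema as I understand it, and may differ between versions, so verify against your own install before loading it with `dconf load /`.)

```
# Hypothetical dconf keyfile fragment (Tracker 1.x era key names).
[org/freedesktop/tracker/miner/files]
# -2 disables the initial filesystem crawl entirely
crawling-interval=-2
# stop watching directories for changes
enable-monitors=false
```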
Screenshot showing Tracker consuming loads of everything, just after log-in.
2. Crashes and Freezes
Next up is something akin to heresy: crashing and freezing of the whole desktop UI. Seriously, it’s that bad.
You are in the middle of something, as you might be in a productive desktop environment, and BAM! no window response. That’s it. All gone. This single issue is by far the most perplexing and irritating, totally demolishing my productivity recently.
When you start searching on t’interweb about this, you realise that this has haunted GNOME for years, and in multiple versions. The nearest posts I have found on the web which seem related to the problem I have are here:
3. User Switching

An alternative way to make GNOME hang on you is to use live user switching. Just set up another user account, then Switch User via this menu. Then, as your new user, switch back to your original account.
Do this a few times for maximum effect, until you get stuck looking at the frozen greeter, just after it’s accepted your password for logging back in.
Enjoy the view.
It’ll last a while.
In fact, no need to take a photo. This’ll last long enough.
4. GNOME Online Accounts
Ahh, GOA. Such a good idea. Implemented in such an average way.
GNOME Online Accounts is meant to centralise internet service (or “cloud”, hwk-ding) accounts through one easy GUI component, and then share the online resources of each account with the appropriate desktop software. Think of Google Calendar being visible in your desktop calendar, which is a separate desktop application from, say, your email reader (where you could read your Gmail). But there’s no need to set up each application separately; just set up the account in GOA and each application gets the relevant access. Get the idea?
The account set-up bit of this is, actually, great. I’m all for it too – this whole concept. It just makes so much sense.
One of the problems with it is that things don’t work properly. For example, if you use two-factor authentication in your Google account, and rely on application-specific passwords, then GOA doesn’t like that. You will be constantly prompted for your Google account password, which is never accepted.
To be fair to Jessie, I haven’t seen this happen recently, so it may have finally been plugged. Or I may just be lucky.
5. Evolution’s management of GOA’s SMTP/IMAP accounts
Another problem is SMTP/IMAP accounts. Sure, they integrate nicely with Evolution. Until you edit parts of the account in Evolution, which are more application-specific. Then, you return to your account folders list with your GOA mail account being renamed to “Untitled”. A rummage through, and edit of, the relevant ~/.config files is required to correct this error. Not so slick.
I still have hope though. One day this stuff will work great.
6. Evolution Hangs
Yep, another hangy-crashy thing. Sometimes, for no discernible reason, when you close Evolution it hangs, mid-termination. Forever. You have to send it a KILL signal to actually get it to close off completely. Why? Who knows. It appears to be a timeout or spinlock type of problem. Sorry for being vague, but look, just do a Google search and pick a year. It looks like this bug has been around in one incarnation or another for a very long time.
7. Nautilus Hangs
Are you seeing a pattern here? Yep, our faithful friend and file utility, Nautilus, also hangs. Quite often. Why it does this, I have not yet been able to determine. SIGKILL to the rescue. (You can do a Google search on this too…)
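For completeness, here’s the force-kill dance, sketched with `sleep` standing in for a wedged Evolution or Nautilus. The point is that SIGKILL, unlike SIGTERM, cannot be caught or ignored, so the kernel ends the process without its cooperation:

```shell
# 'sleep' stands in here for a hung application.
sleep 300 &
pid=$!
kill -KILL "$pid"        # SIGKILL: cannot be caught or ignored
wait "$pid" 2>/dev/null
echo "exit status: $?"   # 137 = 128 + signal 9 (SIGKILL)
```

In real life it’s more like `pkill evolution` first (polite), then `pkill -KILL evolution` when that gets ignored.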
8. Standby and resume with remote file system mounted
It might be chunky, but the T420 is a solidly-built machine, with good internals.
Now, I admit, this is a silly thing to do when you look at it, because you are clearly asking for trouble if you have a remote filesystem mounted into your own filesystem, and then put your machine to sleep for a while.
You can make the problem worse still if you have a laptop with a docking station. Simply put it to sleep, undock, wake the machine, then reconnect using your wireless instead of Ethernet. The outcome varies from a locked desktop (where nothing works) to a frozen Nautilus.
Again, a silly thing to do, perhaps, but also an innocent mistake at times. Like, when you’re rushing to attend a meeting, for example.
So, why not be offered a notification, when requesting to “sleep” the machine, saying that remote filesystems are mounted? I think even I might be able to knock up some code for that one (but I’d prefer to leave it to the experts, who I respect fully and who would do it far better than I).
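The detection half of that notification really is trivial – read /proc/mounts and look for remote filesystem types. A quick sketch (the set of “remote” types below is my own guess, not an authoritative list):

```shell
# Sketch: print mount points whose filesystem type looks remote.
check_remote_mounts() {
    # /proc/mounts fields: device, mount point, fs type, options, ...
    awk '$3 ~ /^(nfs|nfs4|cifs|smbfs|sshfs|fuse\.sshfs|davfs)$/ { print $2 }' "$1"
}

# Illustrative input; on a real system pass /proc/mounts instead.
cat > /tmp/sample_mounts <<'EOF'
server:/export /mnt/projects nfs4 rw,relatime 0 0
/dev/sda1 / ext4 rw,relatime 0 0
EOF
check_remote_mounts /tmp/sample_mounts   # prints: /mnt/projects
```

Hooking that into the suspend path so it raises a notification is the part best left to the experts.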
9. Audio Output Switching
GNOME allows a nice, quick way of locating and launching its Settings dialogs.
As you may have gathered from previous comments, when it comes to GNOME I am primarily a business user. My business runs on and relies on GNU software & Linux. For the experience and knowledge I have gained – not to mention being able to sustain an income and lifestyle I’m happy with – I am indebted to many people for their determined efforts in the free software community.
Unfortunately, little bugs creep in here and there – that’s the rule of life. One minor annoyance with Jessie, that wasn’t present in its predecessor Wheezy, is automatic audio output switching. In Wheezy, after a small tweak to the kernel module loading (via /etc/modprobe.d), the audio output would be directed to my docking station’s analogue jack when the laptop was docked, and then automatically switch to the laptop’s speakers when undocked.
Unfortunately, in Jessie, when my laptop is docked I have to hit the Super (Windows) key, get to the Sound preferences and switch the output device manually. After undocking, the same story. This is apparently fixed upstream, but it’s a regression and annoying nonetheless.
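For the curious, the Wheezy-era tweak lived in a file like the one below. Treat this as a sketch rather than my actual config – the `model` value is illustrative only, and the right hint depends on your particular audio codec:

```
# /etc/modprobe.d/snd-hda-intel.conf (hypothetical example)
# Give the HDA driver a model hint so jack detection and output routing behave.
options snd-hda-intel model=thinkpad
```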
The “Search and Indexing” preferences in GNOME Shell. I think the idea was to make things easier. :-/
10. The long pauses and (what seems like) catastrophic resource “sharing”
This is so subjective an issue that I thought it barely worth mentioning, but an issue it is nonetheless. And one that I actually feel is perhaps the worst of all.
When key processes are busy in the GNOME Desktop Environment – say Tracker, for the sake of argument – the “hit” on the rest of the system is shocking. Right now, as I type this blog entry, any mouse-based GUI interactions are extremely sluggish. This could be the reason why:
top - 16:34:34 up 2:00, 2 users, load average: 16.31, 15.97, 13.93
So what is causing such a load on my machine? It doesn’t take long to figure it out, in top:
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
9187 smd 39 19 2239548 210440 34852 R 83.7 1.3 3:50.74 tracker-extract
9148 smd 20 0 693940 59696 8652 S 7.6 0.4 4:33.53 tracker-store
For reference, my trusty ThinkPad T420 uses a 2nd gen Core i7 processor (dual core w/hyperthreading), 16GB DDR3 memory (dual channel), a 64GB mSATA SSD system drive and 500GB Seagate Momentus 7200.4 drive for my /home. It’s a set-up that’s still powerful enough for getting things done, and I’ve grown quite fond of this chunky, heavy laptop (by 2016 standards). Yes, it’s a bit clunky now, but it’s still got it where it counts, and has only required minimal servicing over the years (since 2011).
Back to the main issue, though. You see, I grew up on Amigas. Fully pre-emptive multitasking spoilt me, and I’ve never looked back, or sideways, since. These days, all modern operating systems provide significantly more advanced multitasking and far, far more powerful hardware, but the user’s needs should always come first in a desktop environment. So, having an unresponsive desktop for hours, because a non-GUI process is taking too much CPU and I/O, is not a productivity boon, to say the least.
And just when you thought my tirade was complete, for a special BONUS…
11. Dejadup/duplicity and the inability to restore a backup!!
I love how well integrated Dejadup is into Nautilus. It’s a neat idea, to be able to just navigate anywhere on your file-system and then say “hey, you know what? I wonder if that file I was looking for used to live here?“, or “I really must restore the earlier version of this file, or that file…”. And so on. It even states on its website that it “hides the complexity of doing backups the Right Way (encrypted, off-site, and regular) and uses duplicity as the backend” [my link].
‘GNOME Backups’ was designed to facilitate exactly this, using the Dejadup/duplicity combo, with two main Nautilus integration actions. Firstly, you can right-click in a folder (on blank space) and select “Restore missing files”. Or, you can right-click on a specific file and select “Revert to previous version”. In either case, a dialog will appear prompting you to select a date, from a range of dates when backups occurred. Great, huh?
Except a backup is only good when you’re able to restore it. I was not able to restore mine. The “Revert” functionality simply failed, every time I tried, with a “File not found in archive”-style error message each time. I also tried restoring the entire backup, which also failed. This issue pretty much covers it.
So, perhaps duplicity (the noun, not Duplicity the software) is exactly what it delivers. I don’t trust it with my back-ups. For that job, I use BackInTime.
Conclusion: I STILL LOVE GNOME
I was originally going to entitle this blog post, Debian’s GNOME is a broken user experience, but shied away from making such a bold, and somewhat unfair, claim. However, it’s hard not to conclude that this might actually be the case.
GNOME 2 used to be amazingly solid. In fact, in my younger years I didn’t use it because I perceived it as being a little boring, instead opting for KDE (v2, then v3) as my go-to desktop for quite a while. I would love to have the stability of GNOME 2 – at least as I experienced it – just in GNOME 3 form.
The biggest problem with GNOME 3/GNOME Shell is that I like it so damn much. For me, despite all the wrinkles and annoyances, the occasional memory leaks of “background” indexing processes, the frequent hanging of various applications and the seemingly (at times) untested nature of the software, it’s actually brilliant. It’s fast, feature-full, yet fluid. That’s a rare combination in software.
For me, it’s faster to work in than any other DE, because it combines enough functionality with equally enough transparency. For instance, when I am editing a client’s website files and want to upload them, Nautilus is the hero – allowing me to quickly mount the remote filesystem, upload my files, and then disconnect. No need to launch additional software for that task. We’re just moving data from one filesystem to another, right? That’s what a file manager does and, in the main, Nautilus is exceptional at it.
As an Emacs user, I know I could do a similar thing using TRAMP and Dired mode. I’ll keep that as an option to explore someday soon.
I’ve been using Debian for some time now, migrating away from Fedora on my netbook to start with, and then later on my main work laptop. In general it’s an operating system that does so much right, it’s hard when things occasionally don’t work as expected.
I won’t say that Jessie’s innings with GNOME have been the best; far from it. But hopefully we can look forward to a smoother experience as time goes on.
I used to typically find New Year celebrations a mixed blessing.
Sometimes they can remind you of all the good, great, sad and bad events of the concluding year, in a way that makes you grateful to be alive and with loved ones. Other times, the gratitude can give way to pensiveness, reflection and perhaps also regret.
Having One’s Cake
This new year (2015 into ’16) was a little different, though. Following a very busy but also very rewarding year, the period over Christmas gave me opportunity for reflection and redirection.
2015 was a “solid” year. And by that, I basically mean unrelenting. It was a year without a single week off for annual leave, which proved extremely tiring as the autumn months came around. At the same time, after a gruelling late summer with web projects galore, things began to ease gently towards the end of the year.
It wasn’t to last, but amidst the busyness of work projects there were also a number of social engagements which provided plenty of entertainment and some light relief! All of which was shared on social media, of course (want to connect? Find me on whatever you use).
Always room for a little freshly-made cake, right?
Naturally, life eventually returns to matters of work, which I love. This year, my focus is on quality and quantity. Well, if you can have both, why not!
My belief is always that there is no substitute for quality. I apply this principle to all the work my company, Warp Universal, is commissioned for by clients, and to all hosting services too. I’m currently working on some ideas to further guarantee the highest-quality project management and delivery to clients, whatever the challenges!
Providing quality support is paramount in my eyes. I have always been proud to offer good support to our customers, but this hasn’t been without its challenges (being forced to quickly reconsider data storage, in the wake of Schrems vs Facebook, being the most recent).
Building Up Organically
Managing a micro business is no mean feat, as anyone who has done so will testify. At one time, I considered growth to be the largest (and perhaps only) signifier of a successful business. But this is false, and I’m glad I realise that now. Many struggling businesses are those that have grown too quickly, without enough consideration, or without the ability to scale back sales satisfactorily. It’s my intention to grow the business organically, sustainably and vertically.
2016 is looking to be a very promising year for Warp Universal 🙂
Alongside work, 2016 is looking to be a great year for my surfing. Not because the weather patterns look particularly convivial to it, nor that my free time is that much greater than it was before. It’s simply that I want to surf more in 2016, and I’m in a position to make it happen.
Along with that, it’s definitely a year to align my media production with media consumption. A great love of mine is music, and work commitments have often meant I’ve lost touch with newer acts on the scene. I look forward to reconnecting via a music subscription service.
The year ahead is an interesting prospect. 365 days remaining from today (leap year, remember!) to achieve so many goals. And not forgetting that life is short, so a little fun should be had also.
Discovering the IndieWeb movement was a 2015 highlight for me. It addressed many of my concerns about the direction of the modern internet, especially regarding ownership and control over one’s own data. But to truly own your own data, self-hosting is a must!
Background: Self-hosting your own stuff
I’m an ideas person. I have a number of projects – or, rather, project ideas – lined up, which I need to record and review. My blog provides me with the ideal space for that, as some ideas may attract the attention of others who are also interested. But why does this matter?
As someone who naturally likes to share experiences and knowledge, I see no benefit in not sharing my ideas too. After all, the web is all about sharing ideas. This matters to me, because the web is widely regarded as the most valuable asset civilised society has today (aside from the usual – like natural resources, power, warmth and sustenance)!
Owning your own data
As a small business owner, I sometimes benefit from various common business practices. For example, the standard accounting principle of straight-line depreciation means that after several years, capital assets once purchased by the business have little-to-no use for the business, meaning they become potential liabilities (in both the financial and risk-management senses). This means I am able to get hold of used, good-condition computing hardware that is 4-5 years old at very little cost.
Even 10-year-old servers still make good general-purpose machines. I’ll be using one of them for this blog, soon. Expect plenty of caching!
This is useful for me, as a blogger and an IndieWeb advocate, as I can not only publish and manage all my own data, but also physically host it too. As I have fibre broadband running to my house, it’s now feasible to serve my blog at reasonable speeds, with 10-20 Mbit/s upstream (“download speed” to you), which is sufficient for my likely traffic and audience.
This ties in nicely with one of my core beliefs: that people should be able to manage all their own data if they choose. I am technically competent enough, and have the means at my disposal to do it. So why not!
Another driver towards this is that I wish to permanently separate “work” and “pleasure”. My business web hosting and cloud service is for my customers. Yes, we host our own web content as a business, but personal content? Well, in the interests of security and of keeping those interests separate, I am pushing towards making personal content something that is only hosted for a paying customer.
Of course, I would encourage anyone to start their own adventure self-hosting too!
Many bridges to cross
Naturally, taking on this type of arrangement has various challenges attached. Here is a selection of the tasks still to be achieved:
Convert some space in house for hosting
Create a level screed
Sort out wiring
Fire detection/resistance considerations
Power supply (e.g. UPS)
Get server cabinet & rack it up
Configure firewall(s)/routing accordingly
Implement back-up – and possibly failover – processes
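On the firewall/routing point, the core of it is a couple of NAT rules punching inbound web traffic through to the server. A sketch in iptables-save format – the interface name and internal address below are placeholders, not my real network:

```
# Hypothetical NAT fragment (iptables-save format): forward inbound
# HTTP/HTTPS arriving on eth0 to an internal server at 192.168.1.10.
*nat
-A PREROUTING -i eth0 -p tcp -m tcp --dport 80 -j DNAT --to-destination 192.168.1.10:80
-A PREROUTING -i eth0 -p tcp -m tcp --dport 443 -j DNAT --to-destination 192.168.1.10:443
COMMIT
```

You’d also need matching FORWARD rules (or a permissive forward policy) for the traffic to actually flow.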
Step one: documentation
Whilst I am progressing these endeavours, it would be remiss if I didn’t document them. There is a lot to be said for the benefits (to a devop, anyway) of hosting one’s own sites and data, but naturally my blog must carry on while I am in the process of building its new home.
A quick jiggle around of my site’s menu structure will hopefully clarify where you can see this work, going forwards (hint, check the projects menu).
Taking it from here
If you are interested in hosting your own servers and being in direct control over your content/data, why not subscribe to this blog’s RSS feed or subscribe by email (form towards footer). Or if you have comments, just “Leave a Reply” beneath! 🙂
Let’s be clear from the outset: there’s no word that adequately defines MozFest. The Mozilla Festival is, simply, crazy. Perhaps it’s more kindly described as chaotic? Possibly. A loosely-coupled set of talks, discussion groups, workshops and hackathons, roughly organised into allocated floors, feeds the strangely complementary hemispheres of work and relaxation.
Nothing can prepare you for the 9 floors of intensity.
How MozFest works
Starting from the seeming calm of Ravensbourne’s smart entrance, you stroll in, unaware of the soon-experienced confusion. A bewildering and befuddling set of expectations and realisations come and go in rapid succession. From the very first thought – “ok, I’m signed in – what now?”, to the second – “perhaps I need to go upstairs?”, third – “or do I? there’s no obvious signage, just a load of small notices”…. and so on, descending quickly but briefly into self-doubt before emerging victorious from the uneasy, childlike dependency you have on others’ goodwill.
Volunteers in #MozHelp t-shirts, I’m looking at you. Thanks.
The opening evening started this year with the Science Fair, which featured – in my experience – a set of exciting hardware and software projects which were all in some way web-enabled, or web-connected, or web-controlled. Think Internet of Things, but built by enthusiasts, tinkerers and hackers – the way it should be.
“Open Hardware” projects, interactive story-telling, video games and robots being controlled by the orientation of the smartphone (by virtue of its gyroscopic capability).. the demonstration of genius and creativity is not even limited by the hardware available. If it didn’t already exist, it got designed and built.
An Open Web, for Free Society
A multitude of social and policy-driven themes permeated MozFest
As made clear from the opening keynotes on Saturday morning, MozFest is not a place for debate. Don’t think of this as a bad thing. The intention is simply to help communicate ideas, as opposed to getting bogged down in the mire of detail. “Free” vs “Open”? Not here. The advice given was to use one’s ears much more than one’s mouth, and it’s sound advice – no pun intended. I have generally been considered a good listener, so I felt at home not having to “prove” anything by making a point. There was no point. 😉
Several themes were introduced in the keynote speeches which really resonated with the attendees – sorry, the participants of MozFest. That of online security and surveillance, more than two years after Edward Snowden’s revelations, was as prominent as ever. Participation was another key theme, and to me one of the most poignant ideas of the whole weekend. Participation was not encouraged or expected; it was simply threaded into the very fabric of one’s presence. You participated, to a lesser or greater degree. This was one of the most socially inclusive experiences I have ever known.
Stories by the Fireside
I cannot overstate how social inclusion at all levels permeated MozFest. From the smallest of teams – 2 individuals, to the largest groups I saw, people were constantly engaged in conversation, development – personal, social and technical, and – perhaps surprisingly – quiet reflection, too.
Creativity and individuality – there’s a lot of it
Quiet zones were available for those needing a little downtime. The cerebral intensity of the weekend is clearly felt.
The concept of the fireside story appeared several times, reminding us that the web isn’t just a resource in and of itself, but rather a medium to convey information. Storytelling, one of the oldest methods of such conveyance, was a pervasive theme. Represented through journalism, community and leadership, the scale of the recognition (and the reminder) that the web is, primarily, a means to convey stories took me somewhat aback. It’s inescapable logic, almost lost amidst the omnipresent noise of today’s social media.
Looking to the Future
Not only was MozFest a means to appreciate, understand and build upon the means to share information, it was also firmly invested in its future. Science and education were extremely well represented by group talks, workshops and forums.
Pathways were a means for guiding participants through the plethora of activities.
In fact, the sheer number of topics on offer, and the guaranteed clashing of events sure to interest you, simply went to prove one thing: the web is not just big, it’s bigger than you can imagine. How the event planners and coordinators of MozFest actually found a way to combine the multitude of themes and interests into “Spaces” and “Pathways” is a huge credit to the thought-leadership behind this event. By encouraging leadership, the Mozilla Foundation has shown itself to be a more-than-capable leader in as diverse a field as there can be.
What I learned at MozFest
On arrival, I didn’t know what to expect. First-timers don’t. I had a vague inkling that I would face a learning curve, adapting to the culture and activities of the event. Like a wandering spirit, I probably stared starry-eyed at the overwhelming number of quickly-scribbled “adverts”, pinned, taped and hung up everywhere, telling me about “this event” or “that workshop”. Even now, in reflection, I feel that the above post barely scratches the surface of the experience.
It’s sensory-overload, pure and simple. 🙂
MozFest is a journey. Physically, many people made long journeys to attend and participate. To those people, I am grateful – you have made my life richer by your efforts. But psychologically, emotionally and intellectually MozFest is so much more than the sum of its multitudinous parts: It’s an idea, a belief that together we can build something better for much time to come; build something to last that has intrinsic “goodness”. And we are not actually talking about the web. The conversation has evolved. The web might be the medium, but the story is now about us.
The question is, how do we nurture our most sublime nature, and be all we can?
I had been cultivating a fascination with Jekyll for blogging for a short while. It looked oh so clean, and minimalist, and sleek. It has its fans, for sure, and I am one of them.
If I were starting my blog from this day, I would almost certainly consider using Jekyll for it, rather than WordPress.
WordPress: better the devil?
But, I am not. Back in 2007 (can it really be so long ago?!), when I started blogging, I didn’t give much thought to my requirements eight years down the line. And the funny thing is, they have hardly changed.
Org2Blog is everything I need from blogging. It’s quick, because I can compose my text in Emacs, and also supply my category and tag information directly too.
When saving the post in Emacs, I can save a local copy using the same date-title-based file name schema that Jekyll would expect (e.g.: 2015-10-28-Assessing_Jekyll_as_an_alternative_blogging_platform.org).
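That naming schema is simple enough to reproduce programmatically, should a bulk conversion ever be needed. A little sketch – my own throwaway helper, not anything from Org2Blog or Jekyll:

```shell
# Sketch: derive a Jekyll-style file name (YYYY-MM-DD-Title_with_underscores.ext)
# from a date and a post title.
jekyll_filename() {
    date="$1"; title="$2"
    # collapse each run of non-alphanumerics to one underscore, trim the ends
    slug=$(printf '%s' "$title" | sed -E 's/[^A-Za-z0-9]+/_/g; s/^_+//; s/_+$//')
    printf '%s-%s.org\n' "$date" "$slug"
}

jekyll_filename 2015-10-28 "Assessing Jekyll as an alternative blogging platform"
# → 2015-10-28-Assessing_Jekyll_as_an_alternative_blogging_platform.org
```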
Further benefits to Emacs/WordPress duality
As indicated by the previous filename example, blogs can be saved locally on my hard disk in Org-mode format, allowing me the option later on to convert everything for a Jekyll-based future. In other words, making the decision to hard-switch from one system or another need not be rushed and can, in fact, be assessed based on technical need.
Another “turn-off” from Jekyll is that, despite various attempts to make it easy to migrate WordPress posts, I found the process awkward and the documentation confusing. There is more than one way to skin this cat.
For me, Emacs provides such a comfortable environment using Org2Blog that it’s really hard to justify the alternative approaches of org-jekyll or Org+Jekyll.
Disadvantages to using WordPress
Well, it’s not elitist 😉
But aside from that, there are a few serious disadvantages. And these are ones you already know about: there’s lots of (potentially-vulnerable) PHP running, which is a security risk and also makes WordPress … slow.
Also, WordPress makes microblogging, or “notes” in IndieWeb parlance, not very easy. I want to publish my own microblog on my site and syndicate it elsewhere, but this will take further investigation.
WordPress, also, has a reputation. It’s a bit like Walmart (or Asda in the UK). It’s a great, hulking CMS that everyone knows. It’s everywhere. Everyone uses it. Which means there’s less that’s “special” about it. And that’s a shame, because for all of that it’s really quite brilliant.
What WordPress gives me
Managing SEO settings per-post in WordPress
Like others, I’m a firm believer in the IndieWeb movement, but I don’t have enough time to write software for personal use right now. Luckily, many talented and dedicated individuals have stepped up and kindly donated their time and code to enable the IndieWeb on WordPress sites. This suits me down to the ground. At least I can support the movement by advocating and using their code.
WordPress also gives me flexibility. If I wish to write a short post about some coffee I’ve tried, I can. Picture too. If I wish to incorporate a video or music in a page for some reason, the built-in editor makes that effortless. As it does embedding a tweet, too. WordPress is doing favours for the web at large by keeping our writing options open and encouraging open sharing, rather than feeding us the silo-centric drivel-data we see so often from certain social networks!
One last thing WordPress offers: people who are not computer-confident can use a device like a Chromebook, or even their phone, and still have a compelling and easy-to-use platform for sharing content.
Free software such as Linux is great at many things, including keeping your data very safe. That is, if you are in relative control of it yourself.
Transferring sensitive files from one machine to another – offline, via USB stick.
Linux is also used by the likes of Google, Facebook, et al., not to mention most western governments. In fact, its flexibility, suitability and cost-effectiveness mean it’s pretty much everywhere: in most pieces of consumer electronic equipment, in the networking kit employed in telephone exchanges and data centres, through to the end points – the receiving servers which constitute “the cloud”.
Its use and application is rich and strange: sometimes in your interests, and often, arguably, not so. But whether you’re a Linux/UNIX, Windows or Mac user, taking care of your own data is vital for a life of value!
Making your digital life private, again
Is it possible to retract data that you previously opted to store online, and be confident that cloud service providers no longer keep it stashed somewhere? There are two responses to this:
Let’s assume for a moment that “yes” is, by far, the prevailing truth. “Yes”, data which I previously uploaded was properly deleted when I deleted it, and an online service provider no longer has any copy, nor any metadata about my data (OK, I’m laughing now).
“there are many ways in which you can protect your data, and protect your privacy”
Many of us have done it: uploaded photos to Google Photos, posted images or event information to Facebook, shared our location on Twitter, set up an account on … well, the list goes on. But forgetting the “privacy” policy of such entities, just for a second (well, ok then – it’s not that easy to put aside “We store data for as long as it is necessary to provide products and services to you and others”, but even so!!), there are many ways in which you can protect your data, and protect your privacy. It starts with a little effort and time.
First things first: get a backup routine!
Rome wasn’t built in a day, and neither is your data security. For example, consider the following:
Is any of your personal data stored on company equipment?
Do you absolutely know, hand on heart, that your data is backed up?
Did you go ahead and do that yourself?
Did someone sign a certificate and say, in no uncertain terms, that they did that on your behalf?
When did you last audit your data?
A friend of mine recently lost years of pristine digital photos due to a failure of company equipment (“the company laptop”) and because he hadn’t backed them up to a secondary device – even though he had one of sufficient capacity in his possession!
Don’t let this be you! Get a routine in place for backing up, even if it’s only monthly; cameras and phones usually have enough capacity to store a month’s worth of shots.
Designate somewhere safe for your backup!
A safe location can be anywhere. You don’t have to get a fireproof safe – although I’m not saying don’t! But if you back up your personal data at home, try not to keep your backup at home. A USB drive costs so little these days that it’s the perfect medium for backing up photos, then taking it to work and locking it in your desk drawer.
Encrypting a removable storage device is always a good idea, provided you can easily remember a strong password. The ease of managing encrypted removable storage does vary between operating systems, though (on GNU/Linux, it is very easy).
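For example, one simple approach on GNU/Linux is file-level encryption of the backup archive with GnuPG’s symmetric mode, sketched below. The archive name and passphrase handling are illustrative only, and if you want the whole USB stick encrypted, LUKS via cryptsetup is the more thorough route.

```shell
# A sketch of passphrase-based encryption for a backup archive using GnuPG,
# which ships with most GNU/Linux distributions. File names and passphrase
# handling here are examples, not a recommendation to pass secrets on the
# command line in shared environments.
encrypt_backup() {
    archive="$1"        # e.g. photos-2016-05.tar
    passphrase="$2"     # a strong passphrase you can remember
    # --symmetric means no key pair is needed to decrypt on another machine,
    # just the passphrase; output is written to "$archive.gpg".
    gpg --batch --yes --pinentry-mode loopback --passphrase "$passphrase" \
        --symmetric --cipher-algo AES256 "$archive"
}

decrypt_backup() {
    # args: encrypted file, output path, passphrase
    gpg --batch --yes --pinentry-mode loopback --passphrase "$3" \
        --output "$2" --decrypt "$1"
}
```

The `.gpg` file is what goes on the stick; the original stays at home. Decrypting at the other end needs only GnuPG and the passphrase.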
Test restoring from your backup and backup again!
A backup is no good if you can’t restore files from it. Luckily, with a simple backup process you can easily monitor and validate that your backups have occurred successfully. If you are confident that your system backups work ok, do another one. Then store. Wash, rinse, repeat.
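One low-tech way to do that validation is to record checksums of the originals and verify the backup copies against them. A sketch, with illustrative paths:

```shell
# A sketch of backup verification with checksums: record a digest of every
# original file, then confirm each backup copy matches. Paths are examples.
verify_backup() {
    src="$1"
    dest="$2"
    sums=$(mktemp)
    # Checksum every file under the source tree, with paths relative to it...
    (cd "$src" && find . -type f -exec sha256sum {} +) > "$sums"
    # ...then check the copies in the backup against that list. The exit
    # status is non-zero if anything is missing or differs.
    (cd "$dest" && sha256sum --check --quiet "$sums")
}
```

Run `verify_backup "$HOME/Pictures" "/media/usb-backup/photos"` after each backup: silence means every copy matched, and any missing or corrupted file is named.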
How does this keep my data private?
By setting a rule for yourself to back up your own data, you won’t become so dependent on cloud services for backing up your photos.
Common objections to keeping data off the cloud include the oft-argued (but ill-conceived) notion that cloud storage is free of cost. Let’s just examine this for a brief moment:
Data centres cost hundreds of thousands, to millions of £/$/€ to build
Running costs are tens to hundreds of thousands of £/$/€ each month
They must be staffed, too – requiring monthly salaries
If everyone is uploading for free, how can it pay for itself?
There must be an end-purpose: the end does not justify the means!
The value of your “free” data storage is in the metadata that is stored with it. Tied to your user account (that same user account you might use to log in to other services, signifying your activity at other times even when not using the primary service…) is data – in the form of metadata – that describes it quite clearly.
What photo metadata tells my cloud provider about me
That photo which was kindly synced to your cloud provider’s account will contain data, like:
Where you were (where you live, work, visit, or where friends and family live, work, etc.)
What local time it was (when you may not be working, placing you into a social demographic)
What equipment you were using (which brand you like to buy)
What network you were using (who you are a customer of)
What the weather was like at the time of the photo
Who you were with, from the faces of the people you photographed … thus registering where they were at that time too (thanks to facial recognition technology, and perhaps against their will)
Whether you were inside or outside, judging by the prominent colouring in the photo
… and much more.
When free is not free
If I am a massive indexing engine and I start aggregating and analysing these data, I will be able to determine some interesting trends:
How many people use my service in an area/region/country
How many people who use the service were in a particular area/region/country at a specific time
How many of those use Camera brand “B” or Phone brand “A”
How many faces I recognise (people who have opted in to facial recognition)
Who is in whose “networks” and extended networks (friends of friends)
How many faces I don’t recognise (potential targets for acquisition – new users)
How many people like being outdoors on a bright, dry day
And how many don’t
Whether you like being outdoors … or not
Who you like being with during those conditions
What you might be doing at that time, on that type of day, in those conditions, with those people, while using your “brand X” device.
we are now at a stage where it is easier to get a phone, and rely on Facebook for photo storage
Some people I know seem apathetic towards online security, and yet suspicious of cloud service providers’ intentions too. Perhaps we are now at a stage where it is easier to get a phone and rely on Facebook for photo storage than to “bother” seeking alternatives. “The answer is not readily to hand, so let’s move on.”
Living a life less ordinary
The problem with systems is that they need parameters. Do a search for something, somewhere, and you’ll be sure to see ads and sponsored links for that thing, somewhere else. This is, and has been for a while, the new internet “norm”.
This “think, search (hunger); feed (consume with contextual data)” lifestyle is what creates the “search bubble”: a self-fulfilling data management and presentation matrix based on your lifestyle habits, and one well worth breaking out of.
By adopting a simple routine such as taking care of your own data and not subscribing religiously to online services, it’s possible to find not only more sanctity in life’s unique moments, but also more richness from the due consideration of others. Where people know you a little less, and are curious to know you a little more.
I was looking for a free software program to help me learn to touch-type, and shortly after my search started I found GNU Typist. And GNU Typist (gtypist) is a gem.
The instructions are simple and the purpose of the program is equally simple: to “condition” the user into adopting and maintaining good typing habits. After starting the basic lessons (“Quick QWERTY course”), it soon became clear that my touch-typing capability was far poorer than I had hoped and my typing speed these days is generally just luck-driven.
Thankfully, a considerate fellow called Simon Baldwin decided to write gtypist, and here we are. The online documentation is equally useful; not only do you get help regarding how to acquire, install and invoke gtypist on your machine, but also a list of alternative free software typing programs which are a good fit in various situations (general, education, games-playing, etc). Like most GNU software, a man page is also provided.
It is so easy to take this effort for granted, and yet how useful is this resource! Such is the way with free software: quite often, somebody already had that itch and had to scratch it.
Brasilia style – a good option for any Saturday morning
I’m consciously reworking my way through Taylors’ range of filter coffee. Many times before we’ve had the Italian style medium roast, and we’ve also gone for the number 6 strength “Hot Lava Java”. But sometimes you need a milder option, to more gently ease you into a state of caffeinated bliss.
Score out of 5 … 3. It’s a mellow cup, but perhaps missing some depth to the flavour. Not disagreeable at all though. Would still recommend.