assigned a value that is never used

Saturday, 12 August 2017 11:01 am
bens_dad: (Default)
[personal profile] bens_dad
Whilst writing some (C++) code to parse a file whose contents I only roughly know, I have been printing variables at various points in the code.

I realised that this printing was masking the fact that some variables are not being used, so I inserted some conditional code

// Printing variables with CPLDebug can hide
// the fact that they are not otherwise used ...
#define CPLDebug(...)

so that compilers and other code analysers would not see the printing function and would correctly give warnings like:
[VRC.cpp:976]: (style) Variable 'nNotTIS' is assigned a value that is never used.

That was fine. Until I hit this block of code:

if ( this->nMapID == 8 ) {
    int nNotTIS = VRReadInt(fp);
    CPLDebug( "VRC", // first argument reconstructed; the original call line was lost
        "Pay as you go; skipping value %d=x%08x before Tile Index",
        nNotTIS, nNotTIS); // See * below
}

The variable nNotTIS *is* only used in the print, but = VRReadInt(fp) has a side effect (moving the file pointer) that is intended. In a strict sense, I never do use the value read from the file, and the warning is correct.

I think I will add a comment like
// Expect a warning like
// Variable 'nNotTIS' is assigned a value that is never used.
but is there a better option?

* Unfortunately CPLDebug() does not support %m$ positional specifiers in the format string, e.g. "... %1$d=x%1$08x ...", so I have to pass nNotTIS twice if I want to see both the decimal and hex values.

Oh hey, there it is

Friday, 11 August 2017 09:40 am
rmc28: Rachel smiling against background of trees, with newly-cut short hair (Default)
[personal profile] rmc28
I was commenting last night to a couple of friends that I was not as fatigued by the holiday as I had expected.  And then as I got back to our apartment in the early hours this morning, I felt that familiar drag set in.  Spoke too soon!   So Tony is getting the 'night passes' for the rest of the con and I'm going to be pulling bedtime cover with the children.

(This holiday so far is being an excellent illustration of just how much we have life-at-home optimised for everyone's needs and just how much work it is to cope without those optimisations.  I'd thought my physical stamina was going to be the limiting factor on what we got done; instead it's the family's collective emotional comfort level with being in new places and Doing Stuff.)

Before staying up too late talking to lovely people, last night I danced my legs off at the Clipping concert.  Clipping's hip-hop Afrofuturist dystopian concept album is up for the Hugo award for Best Dramatic Presentation: Short Form, and the con managed to persuade them to come over and play a gig to a bunch of geeks.  The queue for entry was long, and the room was set up with seating, but the band basically said "ok, we're not allowed to get rid of the chairs - we asked - but there's a lot of space here at the front", which was enough to get [personal profile] ceb up and dancing, and I followed.  It was ace.  I think that about 90% of the population right in front of the stage was female-presenting (and within that, mostly white, and mostly around mid-thirties or older).  I am not sure this is Clipping's usual audience demographic? I had a moment of looking around and realising I was dancing in the vicinity of a number of amazing women who I admire greatly and just feeling overwhelmed and joyous and incredibly lucky to be there at that time.  (Speaking of, [personal profile] mizkit also liked the gig.)

So I not only danced at a Clipping gig a few metres away from Daveed Diggs, I had a short appreciative conversation with him in the bar afterwards, and my internal squee may not stop for days.

Totally worth being shattered today.

It's actually happening

Saturday, 5 August 2017 06:16 am
rmc28: Rachel smiling against background of trees, with newly-cut short hair (Default)
[personal profile] rmc28
A bit less than two years ago, I was in a hospital bed creating a googledoc named Helsinki, with open in another tab, starting to build up the shape of the holiday we could have using Worldcon as an anchor.

Now I'm in an airport hotel room, about to wake up the children and go get our flight to Helsinki.

Deoptimisation can be a virtue

Thursday, 3 August 2017 11:43 am
simont: (Default)
[personal profile] simont

There's a well-known saying in computing: premature optimisation is the root of all evil. The rationale is more or less that tweaking code to make it run faster tends to make it less desirable in various other ways – less easy to read and understand, less flexible in the face of changing future requirements, more complicated and bug-prone – and therefore one should get out of the habit of optimising everything proactively, and instead wait until it becomes clear that some particular piece of your code really is too slow and is causing a problem. And then optimise just that part.

I have no problem in principle with this adage. I broadly agree with what it's trying to say. (Although I must admit to an underlying uneasiness at the idea of most code being written with more or less no thought for performance. I feel as if that general state of affairs probably contributes to a Parkinson's Law phenomenon, in which software slows down to fill the CPU time available, so that the main effect of computers getting faster is not that software actually delivers its results more quickly but merely that programmers can afford to be lazier without falling below the ‘just about fast enough’ threshold.)

But I have one big problem with applying it in practice, which is that often when I think of the solution to a problem, the first version of it that I am even conscious of is already somewhat optimised. And sometimes it's optimised to the point of already being incomprehensible.

For example, ages ago I put up a web page about mergesorting linked lists; [personal profile] fanf critiqued my presentation of the algorithm as resembling ‘pre-optimised assembler translated into prose’, and presented a logical derivation of the same idea from simple first principles. But that derivation had not gone through my head at any point – the first version of the algorithm that even worked for me at all was more or less the version I published.

Another example of this came up this week, in an algorithmic sort of maths proof – I had proved something to be possible at all by presenting an example procedure that actually did it, and it turned out that the procedure I'd presented was too optimised to be easily understood, because in the process of thinking it up in the first place, I'd spotted that one of the steps in the procedure would do two of the necessary jobs at once, and then I devoted more complicated verbiage to explaining that fact than it would have taken to present a much simpler procedure that did the two jobs separately. The simpler procedure would have taken more steps, but when all you're trying to prove is that some procedure will work, that doesn't matter at all.

I think the problem I have is that although I recognise in principle that optimisation and legibility often pull in opposite directions, and I can (usually) resist the urge to optimise when it's clear that legibility would suffer, one thing I'm very resistant to is deliberate de-optimisation: once I've got something that has been optimised (whether on purpose or by accident), it basically doesn't even occur to me in the first place to make it slower on purpose. And if it did occur to me, I'd still feel very reluctant to actually do it.

This is probably a bias I should try to get under control. The real meaning of ‘premature optimisation is bad’ is that the right tradeoff between performance and clarity is often further towards clarity than you think it is – and a corollary is that sometimes it's further towards clarity than wherever you started, in which case, deoptimisation can be a virtue.


Page generated Wednesday, 16 August 2017 03:18 pm
Powered by Dreamwidth Studios