"The octal system made sense back then"
OK, but this passage, as it stands, doesn't make sense to me. I remember octal being popular, and I noticed it going out of fashion in favour of hex, without knowing the reason behind either. If it's important now that hex, unlike octal, digits fit in bytes without leaving surplus bits, why was it not important "back then"? (When?)--Alkhowarizmi (discuss • contribs) 04:14, 1 April 2016 (UTC)
During the 1970s, the CDC mainframe on which I worked had all logical components built from discrete physical parts. Each AND gate, for example, contained individual transistors, resistors, etc., before the introduction of integrated circuits, in which a single "chip" contains many logical components. Consequently, hardware was relatively expensive, and that mainframe had a "word" of 24 bits: eight octal digits of 3 bits each, or 4 bytes of 6 bits each. When hardware became less expensive and computer scientists wanted more characters than could fit in 6 bits (64), plus a parity bit, hardware became standardized on a "word" of 32 bits, eight hex digits of 4 bits each.
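The digit arithmetic behind that comment can be checked directly; a small Python sketch (Python used here purely for illustration, not as anything period-accurate):

```python
# Octal digits carry 3 bits; hex digits carry 4.
# A 24-bit word divides evenly into 8 octal digits (24 / 3),
# while a 32-bit word divides evenly into 8 hex digits (32 / 4).
assert 24 % 3 == 0 and 24 // 3 == 8
assert 32 % 4 == 0 and 32 // 4 == 8

word24 = 0o77777777    # largest 24-bit value, written as 8 octal digits
word32 = 0xFFFFFFFF    # largest 32-bit value, written as 8 hex digits
print(word24 == 2**24 - 1)   # True
print(word32 == 2**32 - 1)   # True
```

Note that a hex digit does not divide a 24-bit word evenly (24 / 4 = 6 digits, but a 6-bit byte spans one and a half hex digits), which is why octal was the natural notation on such machines.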
The first paragraph has so many errors that it needs to be rewritten from scratch. If I do it, there's some risk (at this stage) that I replace the serious mathematical errors with minor computer science errors, which wouldn't necessarily be an improvement.--Alkhowarizmi (discuss • contribs) 00:31, 2 April 2016 (UTC)
Checking my edits?
Despite my last remark, I've been making quite a few edits where I feel confident enough. I'm hoping that if I introduce errors someone may pick them up and fix them.--Alkhowarizmi (discuss • contribs) 09:08, 2 April 2016 (UTC)
j and 1j
After writing the section about always using 1j, not simply j, to represent the imaginary unit, I played around with complex() and found
>>> complex("j")
1j
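The contrast between the literal `1j`, a bare name `j`, and the string accepted by `complex()` can be sketched as follows (a minimal illustration in a fresh interpreter session, where no variable `j` has been bound):

```python
# 1j is the literal for the imaginary unit; a bare j is just a name.
z = 1j
print(z ** 2)               # (-1+0j)

# complex() accepts the string "j" and parses it as the literal 1j.
print(complex("j"))         # 1j
print(complex("j") == 1j)   # True

# A bare j raises NameError unless a variable j has been bound.
try:
    eval("j")
except NameError as e:
    print("NameError:", e)
```

So the parser for `complex()` strings treats `"j"` as `1j`, even though the expression `j` on its own is only a variable reference.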
No. Your discovery is a positive contribution to Wikiversity's knowledge base. It highlights the difference between