I'm aware of at least one PS3 game port that had around 40GB of data at launch. I'd be surprised if that were the only one.
Most games' source data is much larger than the shipped data size for a number of reasons, mostly because files designed for human editing encode a lot of information not needed by the game itself, while the shipped data is aggressively compressed, making it unsuitable for further editing. A full snapshot of the source data for a game like Battlefield 4 is likely to be on the order of hundreds of GB, maybe getting up to the TB range. And if multiple teams are working on the same source data, then that data needs to be sent between them.
Also, before the final few months of development, games' runtime data will often significantly exceed the final shipped size because compression and quality tradeoffs have not been decided and applied.
It allows multiple (equivalent) definitions from different translation units to coexist in the same program. That's the only actual meaning attached to it by the standard.
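A minimal sketch of that guarantee (the file names and the `twice` function are hypothetical, just for illustration):

```cpp
// twice.h (hypothetical header): the inline keyword permits this
// definition to appear, token-for-token identical, in every
// translation unit that includes the header.
inline int twice(int x) { return x + x; }

// a.cpp and b.cpp would both #include "twice.h", so the linked program
// contains two definitions of twice(). The standard allows this (and
// the linker merges them into one entity) precisely because the
// function is declared inline. Without `inline`, the duplicate
// definitions would violate the one-definition rule and fail to link.
```

Note that nothing here obliges the compiler to actually inline the call; that's purely an optimization decision.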
I've got a similar view on the situation - perhaps technical people are trained or enculturated to look out for the things that can be improved/fixed, rather than taking time to celebrate progress and achievement.
Things like racism (and other related -isms like sexism and, to a lesser extent, ageism) are considered bad because they can be, and typically are, exercised against people based on largely uncontrollable aspects of their outward appearance. Everyone subconsciously creates associations between appearance, race, and social status throughout their entire lives, whether they realize it or not, and then makes judgments about new people they meet in light of those associations. Those judgments based on outward appearance form part of an initial impression that then taints subsequent judgments (and actions), such as those about a person's character or intelligence. Also, people learn that it's socially acceptable and generally expected to treat (say) a black person with less respect than (say) a white person. And entrenched ideas about what people's social status ought to be cause a feedback loop that tends to impose these ideas on subsequent generations.
There are lots of other external properties that people are generally prejudiced for or against, such as weight/height/build, (dis)ability, posture, voice/speech properties, dress sense, and so on; but these (a) are considered to be more under an individual's control, (b) aren't inherited, and (c) historically haven't caused anywhere near as many social problems as racism in the US. No doubt people who are discriminated against based on their voice (say) don't like it, but it's not considered to be a systematic, self-reinforcing, widely-observed, entrenched social problem.
A hypothetical prejudice against "SSN % 104 == 7", where the property is not even outwardly observable (so can't generally taint initial impressions), nor subject to this ongoing reinforcement, nor passed down through generations (neither the prejudiced property, nor the prejudice itself), is completely different from race, even more so than the other examples.
I get the impression that state laws are significantly more uniform in Australia than the US... IMHO, it's more that distributors are "used to" a lower AUD, have exclusivity/best price guarantees, and so can prevent anyone importing goods (at retail scale) that undercut them. They've got their business model worked out on the basis of higher prices, their customers are used to higher prices, and they make arguments like "hey, well, we ran a bit of an advertising campaign and submitted your product for regulatory approval". Perhaps true, but not necessarily worth the extra "tax". See the story about how the digital price for Australian buyers of The Witcher 2 got increased because the Australian distributor made a fuss: http://www.kotaku.com.au/2011/05/why-does-the-witcher-2-cost...
There's certainly that factor, but then it even happens with the app store. Though what I think happened is that the distributors kicked up a fuss when the iTunes store was made available in Australia, so the record companies made the prices higher. Apple learnt from this experience that they could make the prices higher, so they've replicated it across the board. I guess there's also the possibility that some larger app store developers have been able to convince Apple to make the price differential higher for Australia as well.
But the reason why there is an explicit flag in the first place is that there's a lot of code out there that doesn't work with pointers >= 0x80000000, or when pointers could differ by more than that. Things like pointer "tagging", and signed integer overflow bugs, are examples of the sort of code that can be affected. Without knowing details about the Go code generation and runtime/garbage collection, it would be a bit risky to just set the flag and hope for the best.
Oh, I know; I've written bounds-checking logic for RTLs that use casts to integers and tests for negativity to check for overrun, implicitly assuming 31-bit address space. My only point is that a limitation of the Go linker does not preclude setting the flag.