"many people are operating outside their area of expertise on this subject."
Exactly. It takes years of really hard work to get good at this stuff. Decades.
I do realize research budgets are not that awesome, but when claims are aesthetic in nature (explicitly and implicitly) and deal with human craftsmanship, there should definitely also be collaboration with craftsmen as subject-matter experts.
A good example where this was executed really well was the Notre Dame reconstruction (I _guess_). Craftsmen and academic diligence hand in hand.
Not everyone's archaeological reproduction has such a budget, unfortunately.
> I do realize research budgets are not that awesome, but when claims are aesthetic in nature (explicitly and implicitly) and deal with human craftsmanship, there should definitely also be collaboration with craftsmen as subject-matter experts.
Do we know for a fact this didn't happen in this case?
While photography drove academic art nearly to extinction, thank heavens it's still taught and you can find practicing artists. Finding good ones might be a bit hard though.
So you could end up with a _bad_ artist helping you in your reconstruction project.
But finding an incompetent accomplice probably is not in anyone's best interest.
So while hiring _anyone who claims to be an artist_ might be a procedurally and managerially approved method, it really is not the outcome anyone would actually want. So whatever happened here ... it does not count as professional reconstruction.
You don't need to be an art historian or an artist to recognize this.
You just need to compare them to other art from the period and the frescoes, and consider which one you find more appealing. And once you do this, there is a fair chance you will recognize the "good" art feels like an order of magnitude more appealing to you, even if you don't have the training to recognize the exact features that cause this appeal.
An awful lot of the things hanging in museums look "bad" to me. I'm not just talking about the easily-mocked contemporary art. I mean things like Medieval paintings with Jesus painted as a baby-sized adult man. Everything before the development of perspective looks like a grade-school cartoon.
I'm sure you're right that reconstructions of painted statues are inaccurate. But I'm not sure that a good-looking reconstruction would be any more authentic. Cultural tastes vary a lot. I suspect that if we ever do get enough data for a valid reconstruction, I won't like it any better.
> An awful lot of the things hanging in museums look "bad" to me. I'm not just talking about the easily-mocked contemporary art. I mean things like Medieval paintings with Jesus painted as a baby-sized adult man. Everything before the development of perspective looks like a grade-school cartoon.
Perspective wasn't newly developed! The Greeks and Romans already used it just fine, for example.
What was lost was artistic training, because there wasn't a sufficient economic market for it. As soon as there is sufficient economic incentive, art magically improves again. This is stunningly obvious if you look at Athens, then Pompeii, then Rome, and then the Vatican (with the attendant backslide until the Renaissance, as you note).
Interesting parallel to the modern day: will AI cause a huge backslide in art, since the economic market for artists is being destroyed?
"since the economic market for artists is being destroyed"
I don't see it being destroyed. I mean the market for art. That's a market for tangible things made by specific humans, pieces that are unique.
Very hard to see how AI will affect that, since the market is dominated to a large extent by the need of art salespeople, art institutions, and art collectors to sustain prestige and investment value.
If it were just about volume, China would have destroyed it decades ago. Clearly adding even more volume will hardly put a dent in it.
> An awful lot of the things hanging in museums look "bad" to me
Sure. But if you have a chance to visit Pompeii, the author's argument will land. The Romans made beautiful art. It seems odd that they made beauty everywhere we can find except in the statues we've reconstructed.
I'm not sure "they look bad" is enough justification. The author dismisses the possible explanation "maybe they didn't consider this a bad style back then" without any real argument other than "there are other works of art with different styles".
I agree that I, personally, do not consider them painted in a way that is pleasing to me. But is that what the reconstruction project is meant to achieve, i.e. a painting style that is pleasing to current audiences? Or is it about reconstructing the bare minimum that can be asserted with some degree of reliability that is actually supported by the physical evidence?
Again I must ask: do we know decent artists weren't involved in the reconstruction project? Remember, the goal is to use their artistry to achieve scientific results, not just do whatever they find pleasing.
> You just need to compare them to other art from the period and the frescoes, and consider which one you find more appealing
I get this is the most compelling part of the argument TFA is making, but to be honest I don't find it all that compelling. Surely the people involved in the reconstruction considered this, and there's a reason why they still produced these reconstructions, and I don't believe that reason is "they are incompetent or trolling".
I believe it is basically irresponsible to present the statues with their base layers only. Either extrapolate the aesthetic top layers that might have been there, or just report that the statues were painted without a visual example. Presenting them as poorly as they do contributes to demoralization and a sense of alienation from one's own cultural roots.
I believe researchers are under pressure not to extrapolate too wildly, unless they can find strong evidence for their extrapolations. In TFA itself they are quoted (very briefly) saying this is not a representation of what the statues actually looked like; it's just the pigments they can guarantee were there.
> Cecilie Brøns, who leads a project on ancient polychromy at the Ny Carlsberg Glyptotek in Copenhagen, praises the reconstructions but notes that ‘reconstructions can be difficult to explain to the public – that these are not exact copies, that we can never know exactly how they looked’.
Consider that had they gone wild with creativity, they would have been criticized for it. Apparently the current overcautious trend is an (over)reaction to previous careless attitudes in archeology.
This is my uninformed take, anyway. I think TFA's author should have engaged more directly with researchers instead of speculating about their motives; the article -- while making some interesting points -- reads a bit snarky/condescending to me. Why not go straight to the source and ask them?
"This is almost certainly not what it looked like at all, and it's hideous, but I am going to make sure this image is disseminated across the literature and the news (which will make everyone think it was actually hideous but oh well)" is just more irresponsible in my mind than any alternative.
The article offers very explicit evidence by showing paintings of painted sculptures, in which the sculptures are rendered in very appealing, naturalistic hues.
I think the museums should hire trained academic artists to do best guess reproductions next to the garish ones.
The garish ones are _equally_ misleading.
Imagine you got a "five-year-old with finger paints" reproduction of the Mona Lisa and were told it was made by a person considered a genius in his time and an artistic giant. What would that make you think of him and his patrons?
"decide what the software is supposed to do in the first place."
After 20 years of software development, I think that is because most of the software out there is itself the method of finding out what it's supposed to do.
Incomplete specs are not missing feature requirements due to a lack of discipline. It's because nobody can even know what the software should be without trying it out.
I mean, of course there is a subset of all software that can be specified beforehand - but a lot of it cannot.
Knuth could be that forward-thinking with TeX, for example, only because he had 500 years of book-printing tradition to fall back on when backporting the specs for typesetting math.
At work I'm implementing new 3D map geometry stuff for my employer (Mapbox), and as a side project I'm building a simple 3D modeling tool that gets you from idea to reliable, solid parts fast (https://www.adashape.com/).
They wanted to build a spaceship (Project Orion), but first had to come up with something they could sell immediately; hence they designed the commercially very successful TRIGA reactor.
"How can you explain how megalithic structures of 100-ton bricks were built by 'primitives'?"
How can you explain that today we can build structures that are 800 m tall, or reroute rivers?
Honestly, good ol' human craftsmanship, multiplied by available labour and combined with 'basic' geometry, gets you really, really far.
Industrial processes don't require individual craftsmanship because it doesn't scale with the speed and velocity required by markets and capital. Hence, if you don't actually care about building stuff, you may think people unassisted by industrial machinery are much less capable than they really are.
Humans are friggin talented.
My opinion is that said structures were made by humans - a function of basic human psychology, times population, times surrounding available resources. You don't need to add alien intelligences to the equation.
And aliens, per your description - not that interested, really, because it sounds more like religious conspiracy theories than something actually profound.
I'm pretty sure there's life out there (I mean, basic chemistry, right?) but I'm not so sure it's anything that would travel here intentionally or that it would have anything to say to us.
I would be happy to be wrong! That’s the most interesting outcome always.
Yes! It's a fascinating conundrum. The giant megalithic "bricks", perfectly fitted and "moulded". Amazing craftsmanship. Entirely possible a forgotten technology was utilized by humans, or by something "before us". Another technological Earth-based civilization that wasn't "human" but the only traces of which remain are their megaliths -- spooky, almost as if they knew they had to leave a stone legacy!
My God, they're rock rectangles cut with reasonable precision! They aren't millimeter replicas of each other.
They float the stones from the quarry. Then they prepare a plumb surface using water, which lies perfectly flat at rest. They measure out a height with a replica-sized reed. Then they use taut twine to chalk the mark. Finally, they pour hard sand on the chalk line and run ropes back and forth in the groove to grind into the soft stone.
It takes time, but it's a process so simple a child can learn it in a day. And then you apply the scale of having a city of adult laborers just as smart (though not as learned - there's a difference) doing it for years and years. Congrats, the rocks stack into a stable shape.
And it was done multiple times, and the history was even recorded for some!
Then, a long time goes by and all the structures built out of other material decay. All the structures not stacked in the most stable shape fall down. And all the structures not important and out-of-date with modern ventilation or security or needs are intentionally replaced. Now you only have the special building, which some folk weirdly worship or make conspiracies about.
So many logical fallacies and biases go into this, it's all incredibly frustrating. And to see how this beautiful, connecting history we share is warped. To see simple human cleverness that proves how we are fundamentally the same as those who came before us, completely cast aside. It's just... GAH!
The men and women who lived when those structures were built were just as intelligent as you! Your capacity for knowledge, your curiosity, your ingenuity - all in the same proportions! They were not "primitives" or "cave people". They were smart human beings who built cool shit!
A slow compiler impedes developer velocity, not only by taking longer, but by breaking their concentration.
The whole point of a programming language is to be an industrial productivity tool that is faster to use than hand writing assembly.
Performance is a core requirement for industrial tools. It's totally fine to have slow compilers in R&D and academia.
In industry, a slow compiler is an inexcusable pathology. Now, it can be that the pathology can't be fixed, but not recognizing it as a pathology - and worse, inventing excuses for it - implies the writer is not really industrially minded. Which makes me very worried about why they are commenting on an industrial language.
I get the feeling the author would just like to use a better language, like F# or OCaml, and completely misses the point of what makes C++ valuable.
C++ is valuable because the existing tooling enables you to optimize the runtime performance of a program (usually you end up figuring out the best memory layout and utilization).
C++ is valuable because its industry support guarantees code bases live for decades _without the need to modify them_ to match the latest standards.
C++ is valuable because the industry tooling allows you to verify large areas of the program behaviour at runtime (ASAN etc).
I simply don't understand what type of industrial use this type of theoretical abstraction building serves.
Using the metaprogramming features makes code bases extremely hard to modify, and they don't actually protect against a category of runtime errors. I'm speaking from experience.
I would much rather have a codebase with a bit more boilerplate, a few more unit tests, and a strong integration testing suite.
C++20 inverts the traditional relationship between the core language and metaprogramming, which arguably makes it a new language in some ways. Instead of being a quirky afterthought, metaprogramming has become the preferred way to interact with code. There is a point of friction in that the standard library doesn’t (and can’t) fully reflect this change.
Metaprogramming style in C++20 only has a loose relationship to previous versions. It is now concise and highly maintainable. You can do metaprogramming in the old painful and verbose way and it will work but you can largely dispense with that.
It took me a bit to develop the intuitions for idiomatic C++20 because it is significantly different as a language, but once I did there is no way I could go back. The degree of expressiveness and safety it provides is a large leap forward.
Most C++ programmers should probably approach it like a new language with familiar syntax rather than as an incremental update to the standard. You really do need to hold it differently.
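To make that concrete, here is a minimal sketch (the function and concept names are made up for illustration) of the same constraint written the old SFINAE way and the C++20 way:

    #include <concepts>
    #include <type_traits>

    // Pre-C++20: constrain a template via SFINAE; readable only with practice.
    template <typename T,
              typename = std::enable_if_t<std::is_arithmetic_v<T>>>
    T twice_old(T x) { return x + x; }

    // C++20: the constraint is part of the signature and reads like prose.
    template <std::integral T>
    T twice(T x) { return x + x; }

    // Or name your own concept with a requires-expression.
    template <typename T>
    concept Addable = requires(T a, T b) { { a + b } -> std::convertible_to<T>; };

    template <Addable T>
    T sum(T a, T b) { return a + b; }

The error messages also point at the violated concept instead of a wall of substitution failures, which is a big part of the maintainability win.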
As someone that has only dabbled in C++ over the past 10 years or so, it feels like each new release has this messaging of “you have to think of it as a totally new language”. It makes C++ very unapproachable.
It isn’t each release but there are three distinct “generations” of C++ spanning several decades where the style of idiomatic code fundamentally changed to qualitatively improve expressiveness and safety. You have legacy, modern (starting with C++11), and then whatever C++20 is (postmodern?).
This is happening to many older languages because modern software has more intrinsic complexity and requires more rigor than when those languages were first designed. The languages need to evolve to effectively address those needs or they risk being replaced by languages that do.
I’ve been writing roughly the same type of software for decades. What would have been considered state-of-the-art in the 1990s would be a trivial toy implementation today. The languages have to keep pace with the increasing expectations for software to make it easier to deliver reliably.
As someone that has been using C++ extensively for the last 25 years, each release has felt like an incremental improvement. Yes, there are big chunks in each release that are harder to learn, but usually a team can introduce them at their own pace.
The fact that C++ is a very large and complex language, and that this makes it unapproachable, is undeniable. But I don't think the new releases make it significantly worse. If anything, I think that some of the new stuff does ease the on-ramp a bit.
C++ can be written as the optimal industrial language it is. Simple core concepts year after year. Minimal adaptation.
The key thing to understand is that you are still using C with sugar on top. So you need to understand how the language concepts map to the hardware concepts. It's much more relevant to understand pointer arithmetic, the difference between stack and heap allocations, and so on, than whatever the most recent language standard changes.
You can write the same type of C++ for decades. It’s not going to stop compiling. As long as it compiles on your language standard (C++17 is fine I think unless you miss something specific) you are off to the races. And you can write C++17 for the next two decades if you want.
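For what it's worth, a tiny sketch of what I mean by mapping language concepts to hardware concepts (stack vs heap), with the sugared and raw versions side by side:

    #include <memory>
    #include <vector>

    struct Point { double x, y; };

    void example() {
        Point a{1.0, 2.0};               // stack: freed automatically when the scope ends
        Point* b = new Point{3.0, 4.0};  // heap: you own it, you free it
        delete b;

        auto c = std::make_unique<Point>(5.0, 6.0); // heap, but the sugar frees it for you
        std::vector<Point> d(1000);                 // the vector object sits on the stack,
                                                    // its 1000 elements live on the heap
    }

None of this has changed meaningfully across standards, which is why the older subsets keep working.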
> Metaprogramming style in C++20 only has a loose relationship to previous versions. It is now concise and highly maintainable. You can do metaprogramming in the old painful and verbose way and it will work but you can largely dispense with that.
This was my takeaway as well when I revisited it a few years ago. It's a very different, and IMO vastly improved, language compared to when I first used it decades ago.
If you're going to go through the effort of learning a new language, it makes sense to consider another language altogether, one without 30 years of accumulated cruft.
An advantage is that if you already know the older language then you don’t have to learn the new idioms up front to use it. You can take your time and still be productive. It isn’t why I would use it but it is a valid reason.
I have used many languages other than C++20 in production for the kind of software I write. I don’t have any legacy code to worry about and rarely use the standard library. The main thing that still makes it an excellent default choice, despite the fact that I dislike many things about the language, is that nothing else can match the combination of performance and expressiveness yet. Languages that can match the performance still require much more code, sometimes inelegant, to achieve an identical outcome. The metaprogramming ergonomics of C++20 are really good and allow you to avoid writing a lot of code, which is a major benefit.
I only wish concepts were easier; those of us who don't use C++ daily have to look up the expression syntax all the time. Much better than the old ways, I guess.
Wait until people see how reflection in C++26 further pushes the metaprogramming paradigm. I'm more hopeful for reflection than I have been for any other C++ feature that has landed in the last decade (concepts, modules, coroutines, etc.).
As someone who had the option to choose between C and C++ back in the early 1990s, coming from compiled BASIC and Object Pascal backgrounds:
What makes C++ valuable is being a TypeScript for C, born in the same UNIX and Bell Labs farm (so to speak), allowing me to tap into the same ecosystem while enjoying the high-level abstractions of programming languages like Smalltalk, Lisp, or even Haskell.
Thus I can program on MS-DOS limited to 640 KB, an ESP32, an Arduino, a CUDA card, or a distributed-systems cluster with terabytes of memory, selecting which parts are more convenient for the specific application.
Naturally, in 2025 I would like to be able to exercise such workflows with a compiled managed language instead of C++; however, I keep being in the minority, thus language XYZ + C++ it is.
Yes, managed languages are all those that have some form of automatic resource management, regardless of what shape it takes, or a higher-level language runtime.
Using Go as an example, and regarding the being-in-the-minority remark: you will remember the whole discussion about whether Go is a systems language or not, and how it was backpedaled to mean distributed systems, not low-level OS systems programming.
Now, I remember when programming compilers, linkers, OS daemons/services, IoT devices, firmware was considered actual systems programming.
But since Go isn't bootstrapped, and TinyGo and TamaGo don't exist, that naturally isn't possible. /s
> C++ is valuable because the existing tooling enables you to optimize the runtime performance of a program
This is true for MANY other languages too; I don't see how this makes C++ different. With gdb it's quite the opposite: handling C++ types with gdb can be a nightmare, and you either develop your own gdb glue code or write C-like C++.
> C++ is valuable because its industry support guarantees code bases live for decades _without the need to modify them_ to match the latest standards.
In times of constant security updates (see the EU's CRA or equivalent standards in the US) you always gotta update your environment which often also means updating tooling etc. if you don't wanna start maintaining a super custom ecosystem.
I don't see this as a positive in general; there is bit rot, and software that is stuck in the past is generally not a good sign imo.
> C++ is valuable because the industry tooling allows you to verify large areas of the program behaviour at runtime (ASAN etc).
Sanitizers are not C++-exclusive either, and with Rust or C#, for example, you almost never need them. Yes, C++ has extensive debugging tools, but a big part of that is because the language has very few safeguards, which naturally leads to a lot of crashes etc. (a concrete sketch of the kind of bug ASan catches follows below).
I think the idea of using only a small subset of C++ is interesting, but it ignores the problem that many people have: you don't have the time to implement your own STL, so you just use the STL. Of course it gives me more control etc., but I'd argue that most of the time writing orthodox C++ won't save time even in the long run. It will save you headaches and cursing about C++ being super complicated, but in the end, in modern environments, you will just reinvent the wheel a lot and run into problems already solved by the STL.
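To be concrete about the kind of bug ASan catches: a deliberately buggy toy program (hypothetical file name), built with -fsanitize=address on GCC or Clang, aborts at the bad read with a full report:

    // use_after_free.cpp (hypothetical example)
    // build: g++ -std=c++17 -g -fsanitize=address use_after_free.cpp
    #include <iostream>

    int main() {
        int* p = new int(42);
        delete p;
        std::cout << *p << "\n"; // heap-use-after-free: ASan reports the allocation,
                                 // the free, and this read, with stack traces
        return 0;
    }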
> handling C++ types with gdb can be a nightmare, and you either develop your own gdb glue code or write C-like C++.
That's why it's better to use lldb and its scripts.
> I think the idea of using only a small subset of C++ is interesting, but it ignores the problem that many people have: you don't have the time to implement your own STL, so you just use the STL.
Yeah, agree. It's just much easier to take a "framework" (or frameworks) where all the main problems are solved: convenient parallelism mechanisms, a scheduler, a reactor, memory handling, etc. So it turns out you're kinda writing in your own ecosystem that's not really different from another language, just in C++ syntax.
Better language? Well, now mix those with the C libraries that you need, and make them generate code as efficient as C++ (I would assume people use C++ for a performance advantage of some kind in many scenarios).
> With std::format, C++ has gained a modern, powerful, and safe formatting system that ends the classic, error‑prone printf mechanisms. std::format is not merely convenient but fully type‑safe: the compiler checks that placeholders and data types match.
Solid remark, and there is consensus that std::println and std::format are an important improvement over std::cout or C's printf.
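A minimal sketch of the difference (std::println needs C++23; with C++20 you'd print the std::format result yourself):

    #include <cstdio>
    #include <format>
    #include <print>
    #include <string>

    int main() {
        int answer = 42;
        std::string who = "world";

        // printf: format string and argument types can silently disagree;
        // most compilers only warn, and only for literal format strings.
        std::printf("hello %s, the answer is %d\n", who.c_str(), answer);

        // std::format / std::println: a mismatch is a compile error,
        // and user-defined types can opt in via std::formatter.
        std::string line = std::format("hello {}, the answer is {}", who, answer);
        std::println("{}", line);
    }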
I'll bite. printf might be unsafe in terms of typing, in theory, but it's explicit and readable (with some caveats such as "PRIi32"). The actual chance of errors happening is very low in practice, because format strings are static in all practical (sane) uses so testing a single codepath will usually detect any programmer errors -- which are already very rare with some practice. On top of that, most compilers validate format strings. printf compiles, links, and runs comparatively quickly and has small memory footprint. It is stateless so you're always getting the expected results.
Compare to <iostream>, which is stateful and slow.
There's also std::format, which might be safe and flexible and have some of the advantages of printf. But I can't use it at any of the places I'm working, since it's C++20. It probably also uses a lot of template and constexpr madness, so I assume it's going to lead to longer compilation times and hard-to-debug problems.
In my experience you absolutely must have type checking for anything that prints, because eventually some never-previously-triggered log/assertion statement is hit, attempts to print, and has an incorrect format string.
I would not use iostreams, but neither would I use printf.
At the very least if you can't use std::format, wrap your printf in a macro that parses the format string using a constexpr function, and verifies it matches the arguments.
_Any_ code that was never previously exercised could be wrong. printf() calls are typically typechecked. If you write wrappers you can also have the compiler type check them, at least with GCC. printf() code is quite low risk. That's not to say I've never passed the wrong arguments. It has happened, but a very low number of times. There is much more risky code.
So such a strong "at the very least" is misapplied. All this template crap, I've done it before. All but the thinnest template abstraction layers typically end up in the garbage can after trying to use them for anything serious.
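For reference, this is the wrapper type-checking I mean; a minimal sketch (the function name is made up) using the GCC/Clang format attribute:

    #include <cstdarg>
    #include <cstdio>

    // GCC and Clang check the format string against the arguments of every call
    // to this wrapper, exactly as they do for printf itself.
    __attribute__((format(printf, 1, 2)))
    void log_msg(const char* fmt, ...) {
        va_list args;
        va_start(args, fmt);
        std::vfprintf(stderr, fmt, args);
        va_end(args);
    }

    int main() {
        log_msg("loaded %d items from %s\n", 42, "cache.bin"); // fine
        // log_msg("loaded %d items\n", "oops");  // -Wformat warning at compile time
    }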
The biggest issue with printf is that it is not extensible to user types.
I also find it unreadable; beyond the trivial I always need to refer to the manual for the correct format string. In practice I tend to always put a placeholder and let clangd correct me with a fix-it.
Except that often clangd gives up (when inside a template for example), and in a few cases I have even seen GCC fail to correctly check the format string and fail at runtime (don't remember the exact scenario).
Speed is not an issue, any form of formatting and I/O is going to be too slow for the fast path and will be relegated to a background thread anyway.
Debugging and complexity have not been an issue with std::format so far (our migration from printf-based logging has been very smooth). I will concede that I do also worry about the compile-time cost.
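On the extensibility point, this is all it takes to teach std::format about one of your own types (Point here is just a made-up example); printf has no equivalent hook:

    #include <cstdio>
    #include <format>
    #include <string>

    struct Point { double x, y; };

    // Opt the user-defined type into std::format by specializing std::formatter.
    template <>
    struct std::formatter<Point> : std::formatter<std::string> {
        auto format(const Point& p, std::format_context& ctx) const {
            return std::formatter<std::string>::format(
                std::format("({:.2f}, {:.2f})", p.x, p.y), ctx);
        }
    };

    int main() {
        Point p{1.0, 2.5};
        std::puts(std::format("p = {}", p).c_str()); // prints: p = (1.00, 2.50)
    }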
I largely avoided iostream in favor of printf-like logging APIs, but std::format changed my mind. The only hazard I've found with it is what happens when you statically link the standard library. It brings in a lot of currency and localization nonsense and bloats the binary. I'm hoping for a compiler switch to fix that in the future. libfmt, which std::format is based on, doesn't have this problem.
"The article" is ambiguous. The one this HN post is about does not argue for it, at all. But the one in the comment above directly says,
> Don’t use stream (<iostream>, <stringstream>, etc.), use printf style functions instead.
and has a code example of what they argue 'Orthodox C++' should be, which uses printf.
I'm all for a more sensible or understandable C++, but not at the expense of losing safety. In fact I would prefer the other way: I still feel incredibly saddened that Sean Baxter's Circle proposal for Safe C++ is not where the language is heading. That, plus some deep rethinking and trimming of some of the worst edge behaviours, and a neater standard library, would be incredible.