Most of the advanced chips (the 16 nm node that Xilinx uses for UltraScale/+ and below) are flip-chip parts and have an interposer, which is basically a very dense PCB that helps fan out the extremely dense, fine-pitch flip-chip bumps. They will usually include extra low-impedance ("landscape" orientation) capacitors on the substrate, which leads to much more relaxed PCB decoupling requirements.
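To put a rough number on the "low impedance" part, here's a quick back-of-envelope in Python; all component values are illustrative guesses rather than datasheet numbers. The same 100 nF capacitance looks very different above ~100 MHz once the mounting inductance drops from a typical PCB cap-plus-vias loop to what you get right on the substrate:

    import math

    def cap_impedance(f_hz, c_farad, esl_henry, esr_ohm):
        # |Z| of a series R-L-C model of a real capacitor
        w = 2 * math.pi * f_hz
        reactance = w * esl_henry - 1.0 / (w * c_farad)
        return math.hypot(esr_ohm, reactance)

    # 100 nF 0402 on the PCB, including via/mounting inductance (assumed ~1.5 nH)
    pcb = dict(c_farad=100e-9, esl_henry=1.5e-9, esr_ohm=0.01)
    # same capacitance mounted on the package substrate (assumed ~0.1 nH loop)
    substrate = dict(c_farad=100e-9, esl_henry=0.1e-9, esr_ohm=0.01)

    for f in (1e6, 10e6, 100e6, 1e9):
        print(f"{f/1e6:6.0f} MHz   PCB cap: {cap_impedance(f, **pcb):7.3f} ohm"
              f"   substrate cap: {cap_impedance(f, **substrate):7.3f} ohm")

Above the self-resonant frequency the inductance term dominates, which is exactly where the on-package caps buy you the relaxed board-level decoupling.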
Having designed FPGA boards with both their 7-series parts and their Zynq UltraScale+ parts, I can say the internal capacitors are such a time and cost saver in terms of being able to fan out more signals without more PCB layers.
I can also attest that even relatively "slow" chips, like 14 nm FinFET MPUs from Renesas, have decoupling caps on the substrate.
I'm looking at the newest Versal chips from Xilinx/AMD for a new design, and buying a SoM and designing our own carrier board could fit the bill nicely. We're still very early in the design process; we also need to get pricing for the chips to see whether the idea is worth pursuing.
The title was edited, supposedly by HN moderators, after I posted it. I actually ran into this YouTube channel and thought it was very interesting, since I didn't realize academia seems to make so many mistakes all the time. https://news.ycombinator.com/item?id=42728742
I just saw the two comments in question and find them absolutely hilarious.
A serious answer to your question - no, these are in the "whoop" weight class. Warhead-carrying drones for anti-personnel and anti-armor use are usually 10 inch and up, meaning they use "10 inch" frames and likely 3115 motors with 6S battery packs. Of course, there's endless variation, but that seems to be an optimal combination.
Whoops and 7" drones are pretty far apart (depending on what you consider "far", I guess). Whoops can barely lift their own weight, while 7" drones can carry maybe a kilo.
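For a rough sense of the gap, a back-of-envelope payload estimate: the thrust and weight numbers below are ballpark assumptions for typical parts, not measurements, and the 3:1 thrust-to-weight target is just a common rule of thumb for keeping some control margin.

    def payload_estimate(n_motors, max_thrust_per_motor_g, dry_weight_g, target_twr=3.0):
        # usable payload if you want to keep roughly target_twr:1 thrust-to-weight
        max_all_up_weight = n_motors * max_thrust_per_motor_g / target_twr
        return max(0.0, max_all_up_weight - dry_weight_g)

    # 65 mm whoop: ~25 g max thrust per motor, ~30 g all-up (assumed)
    print("whoop:", round(payload_estimate(4, 25, 30)), "g")      # a few grams, basically nothing
    # 7" quad: ~1.5 kg max thrust per motor on 6S, ~800 g dry with battery (assumed)
    print('7"   :', round(payload_estimate(4, 1500, 800)), "g")   # on the order of a kilo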
But this is about the overall engineering of Voyager, not just the programming. Also, I'm skeptical about how much better modern hardware would fare in deep-space conditions, considering the use of finer and more fragile electronics. Since you're talking about people in general rather than specialists, also consider how the median software developer seems to focus less on correctness, reliability, and optimization than the standards of spacecraft engineering demand.
1) It was sophisticated indeed, top of its game, but let's not lie to ourselves. We still have better engineers and better programmers today. Just to put things in context: in that time we moved from "Pong" to "World of Warcraft".
And it's not just software. Reducing an entire computer room to the palm of your hand, but with better storage, graphics, and computing power, is basically black magic. I can't imagine what Voyager could do with a current Nvidia chip.
2) Just because people are not trained in some specific domain does not mean that they couldn't be motivated to do it. I bet the people who built Voyager weren't born with the instructions engraved in their brains. And if they learned, other people can too.
If I've learned anything after lurking on HN for many years, it is to never, ever underestimate this community. This place still keeps surprising me in good ways.
> Also, I'm skeptical how much better modern hardware will fare in deep space conditions, considering the use of finer and more fragile electronics.
Since then, we've made massive advances in manufacturing. Maybe COTS parts aren't as usable in space as they were back then, but we can now easily manufacture something more resilient or, as a fallback, simply use those old parts. Also, basically all current electronics are designed to be used on Earth, and are used there ~100% of the time. Over-engineering them for use in space would just be a waste.
> skeptical how much better modern hardware will fare in deep space conditions
Why? Deep-space radiation is only about 4x the dose compared to LEO. Starlink satellites use modern tech, and they've collectively spent >10,000 years in space, since we've launched far more than 2 of them. The whole "modern electronics are more fragile" issue is overblown. The CPUs are tiny and easy to shield. The MMICs use huge features that you can see with a normal microscope.
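For the collective-years figure, a rough sanity check (launch count and average lifetime here are ballpark assumptions, not official numbers):

    # back-of-envelope: thousands of Starlink satellites, a couple of years each on average
    approx_satellites = 6000     # order-of-magnitude assumption
    avg_years_on_orbit = 2.0     # rough average per satellite, assumed
    print(approx_satellites * avg_years_on_orbit, "satellite-years")   # ~12,000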
Where did you get the number of 4x from? It seems different than what I understand, but I don't have any sources handy.
"Modern electronics are more fragile" issue really is not overblown. One of my peers have tested different types of non volatile memory in LEO and the TLC NAND sample gets totally wiped by ionizing radiation within the first week. CPUs, while being mostly logic and less susceptible to low energy events, can still be easily destroyed especially if radiation causes latchup. MMICs and discrete devices have huge features in comparison yes, but the junctions still degrade notably under radiation.
In my opinion, as someone working on LEO satellite hardware, it's easy to have opinions about stuff like correctness and reliability because reliability is not naturally intuitive and usually requires observing many samples over a long time, so most engineers never encounter it directly. However, I've definitely seen a strong correlation between the effort spent on correctness and reliability and the success of missions.
What is Starlink’s failure rate? Genuinely asking; I don’t know. My point is that if it’s > 0, that’s a problem for something designed to go billions of miles away.
Are any facts knowable? Besides the fact that something exists, in some shape or form, at this particular moment, that allows the thought about this to happen.
?? Some people do agree that the Electoral College system was better suited to an earlier time, when the States were more disjointed and the federal government wanted to entice each state to participate in the system by giving the smaller states more power. In the modern day, a system that's closer to people's actual votes makes more sense to some people. This simply allows changing from the Electoral College, where a candidate can win despite having fewer votes than another candidate, to something closer to a true first-past-the-post popular vote.
Of course, I think ranked-choice voting is much, much better, but I don't think it's fair to make this sound like some secret nefarious conspiracy. It also doesn't appear to be related to the article.
Reporter. I saw a pretty cool visualization of someone's activities over a year, and decided to give it a shot. I gave up after the first day because of how annoying it is.
Cubesat Developer's Workshop? Which year was this, if you don't mind me asking?
The funny thing is that I did pretty much the same thing: I had our flight computer prototype in my hoodie pocket to fidget with (since I'm leading all the electronics for the project), but luckily we weren't travelling far and didn't get any invitations from the government folks.
Our first sat, NCUBE, never made it out of the launch canister once in space; the second one was on a failed launch, which probably made some Kazakh farmer's day very interesting - judging from the photos I saw, it came down in a wheat field. The third one deployed successfully, but by that time, alas, I had graduated.
It's not as big of an issue for us since we use nearly all consumer/industrial parts with built-in ESD protection. I was also using it as a way to stress test whether the board would develop problems from handling, temperature and humidity changes, shock and vibration, etc.
As someone with mild knowledge of semiconductor fabrication and optics, I honestly think these technologies are not meaningful enough to justify their cost of development and implementation, compared to the winning solution of arbitrarily complex plastic lens assemblies.
Chromatic aberration can be made good enough with a doublet design, Petzval field curvature is largely solved by the final field flattener (usually with multiple concave and convex sections), and you can still easily fit a large number of lenses in a small form factor.
Additionally, being able to adjust the power of a lens is not a huge gamechanger, as a lot of the complexity in modern optical design goes into countering various defects and distortions like the aforementioned Petzval curvature.
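To illustrate why a strong negative element parked near the image plane is such a cheap fix for Petzval (toy paraxial numbers, made up for the sketch): the Petzval sum depends only on each element's power and index, while the contribution to overall system power scales with the marginal ray height at that element, which is small near the sensor.

    # (name, power in 1/m, refractive index, marginal ray height relative to the first element)
    elements = [
        ("front group",     +20.0, 1.6, 1.00),
        ("mid group",        +5.0, 1.6, 0.60),
        ("field flattener", -18.0, 1.5, 0.15),   # strong negative element near the image
    ]

    petzval_no_flattener = sum(power / n for _, power, n, _ in elements[:2])
    petzval_sum = sum(power / n for _, power, n, _ in elements)
    system_power = sum(power * h for _, power, _, h in elements)   # thin-lens paraxial formula

    print(f"Petzval sum:  {petzval_no_flattener:+.2f} -> {petzval_sum:+.2f} 1/m with the flattener (0 = flat field)")
    print(f"System power: {system_power:+.2f} 1/m (the flattener changes this comparatively little)")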
Rather, the fundamental limit is the sensor size. It's just not practical to achieve much better image quality with a physically small system.
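One way to put a number on the sensor-size argument (assuming a roughly 1/1.3"-class phone sensor with a ~9.8 mm diagonal; the diagonals are nominal): scale the f-number by the crop factor to get the full-frame "equivalent aperture" for total light gathered at the same angle of view.

    FULL_FRAME_DIAG_MM = 43.3

    def equivalent_f_number(f_number, sensor_diag_mm):
        # crop-factor scaling for comparing total collected light / depth of field
        # at the same field of view against a full-frame camera
        return f_number * FULL_FRAME_DIAG_MM / sensor_diag_mm

    print("phone main camera, f/1.8:", round(equivalent_f_number(1.8, 9.8), 1))    # ~f/8 equivalent
    print("full-frame lens,   f/1.8:", round(equivalent_f_number(1.8, 43.3), 1))   # f/1.8

So even a fast phone lens behaves, light-for-light, like a slow lens on a big sensor, which is why polishing the optics further has diminishing returns.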
Thanks for the information! But my understanding was exactly in the direction of the issue you point out - the fundamental limit being the sensor size.
A curved sensor, by allowing a relatively thinner lens assembly (due to fewer elements), could have a larger area and still remain within the allowed overall "thickness budget" of the smartphone. Hence my surprise that they seem to have gone nowhere.