In this article, we’ll be looking at a very simple RxJS 6 pattern that allows us to easily refresh data coming from REST APIs (or any other async data source).
In Angular, we’ll be using a service to handle most of the work, including the data fetching and the refresh logic.
Let’s look at a mock service that we’ll be using to fetch some data from a REST API.
TypeScript

```typescript
import { HttpClient } from '@angular/common/http';
import { Observable, ReplaySubject } from 'rxjs';
import { share, switchMap } from 'rxjs/operators';

export class MockService {
  refresh$ = new ReplaySubject<void>(1);
  data$: Observable<SomeDataType>;

  constructor(private http: HttpClient) {
    this.refresh();
    this.data$ = this.refresh$.pipe(
      switchMap(() => this.getData()),
      share()
    );
  }

  refresh() {
    this.refresh$.next();
  }

  // This one could also be private
  getData(): Observable<SomeDataType> {
    return this.http.get<SomeDataType>(`url`);
  }
}
```
Different planet usually means different atmosphere. Different atmosphere means different optics mumbo-jumbo (thoroughly explained across the internet, like here). Different optical properties usually lead to different sunset colours (and also there seems to be some sort of evidence that the atmosphere composition may be correlated to the ability to support life, but it’s mainly about sunset colours).
Obviously, NASA created a sunset simulator, because why not.
In my job, I often need to replicate projects from a template based on Angular Material. When I do, one of the first things I usually need to change is the color scheme.
Angular Material’s color scheme is defined by two palettes: primary and accent. These two palettes define a range of colors used by all components.
There are many tools available online to generate such palettes. My personal favourite is Material Design Color Generator (also available on GitHub). This tool includes a code generator that automatically creates a Sass map in Angular Material’s accepted format.
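As a rough sketch of where such a generated map ends up, here's how a custom palette plugs into a theme, assuming the Sass theming API of the Angular Material versions of that era; the `$md-custom` name and all colour values are placeholders, not generator output:

```scss
// Hypothetical palette map in the shape the generator emits
// (hue keys plus a contrast sub-map); values are made up.
$md-custom: (
  50:  #e3f2fd,
  100: #bbdefb,
  500: #2196f3,
  700: #1976d2,
  contrast: (
    50:  #000000,
    100: #000000,
    500: #ffffff,
    700: #ffffff
  )
);

@import '~@angular/material/theming';
@include mat-core();

// Build the two palettes the components draw from.
$app-primary: mat-palette($md-custom, 500, 100, 700);
$app-accent:  mat-palette($mat-pink, A200, A100, A400);
$app-theme:   mat-light-theme($app-primary, $app-accent);

@include angular-material-theme($app-theme);
```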
It’s been a while since the last post, but this was worth it.
Sometimes, our infinite quest to discover…well, everything…brings us material that is both fascinating and interesting for everyone.
This usually means images instead of numbers, but this time we’re talking about 3D CG awesomeness!
Our casting for the day features Jupiter and Juno, the dynamic duo that certainly doesn’t fear the spotlight.
Their performance? A deep dive into Jupiter’s Great Red Spot, to discover what lies underneath the cloud tops.
“The solar system’s most famous storm is almost one-and-a-half Earths wide, and has roots that penetrate about 200 miles (300 kilometers) into the planet’s atmosphere. Juno found that the Great Red Spot’s roots go 50 to 100 times deeper than Earth’s oceans and are warmer at the base than they are at the top”
Data was gathered by Juno’s Microwave Radiometer (MWR), which is able to “look” through Jupiter’s clouds by analyzing different lengths of microwaves.
However, we still don’t know what the future of the Great Red Spot will be. While it may have existed for more than 350 years, it has been shrinking quite rapidly.
And as always, thanks NASA!
How many interactions happen every second on the Internet? You can check the numbers anywhere on the web, OR you can go to http://onesecond.designly.com/. Web designer/dev Steven Lewis created a fantastic website along the lines of “If the moon were only 1 pixel”. Scroll your way through thousands of Google searches and don’t forget to …
Disney Research found a different mathematical approach to solve the extremely complex equations necessary to render realistic fabrics.
As you can see from the video released with the official paper, the resulting simulation is extremely accurate. Like almost every algorithm, it’s not a generic solution to every cloth rendering problem. It performs better on large problems (more than 25k vertices), stiffer materials and small masses. Also, as stated by the authors, time to solution doesn’t really scale linearly, but there’s room for improvement.
Anyway, it’s a great addition to the 3D rendering algorithms’ world.
There was a time in human history when no living creature had ever seen the hidden face of our beloved satellite.
As you know, earthlings can only admire one side of the Moon: tidal locking keeps the same face always turned toward Earth. The other side can only be seen from space.
For millennia the laws of physics stood in the way of our quest for knowledge, until one day we learned to use such laws to our advantage.
To go back to that day, we need to set our time machine to an astonishing…
56 years ago
Yep, it was the 26th of October, 1959. A lot of people still alive today were born at a time when we didn’t even know what the other side of the Moon looked like (let alone how to land on it).
On that day, as described on the official mission page:
“The Luna 3 spacecraft returned the first views ever of the far side of the Moon. The first image was taken at 03:30 UT on 7 October at a distance of 63,500 km after Luna 3 had passed the Moon and looked back at the sunlit far side. The last image was taken 40 minutes later from 66,700 km. A total of 29 photographs were taken, covering 70% of the far side. The photographs were very noisy and of low resolution, but many features could be recognized.”
The Russian spacecraft was equipped with an analog camera, an automated film processing lab, a scanner and a transmitter. Yes, it was cutting edge tech back then.
These instruments produced the first picture of the B side of the Moon. The people who looked at the picture as it was transmitted back to Earth also happened to be the first humans who actually saw the other side of our satellite.
Today, there’s a little guy up there called Lunar Reconnaissance Orbiter taking better pictures (among many other things). Here’s a nice comparison made by NASA and published on this article.
Have you ever tried those magical pieces of software that merge multiple pictures of an object from different angles to produce a 3D model of it?
Good. Now think about upgrading your equipment, because those guys at UCLA do the same with atoms. Seriously.
Using a scanning transmission electron microscope at the Lawrence Berkeley National Laboratory’s Molecular Foundry, Miao and his colleagues analyzed a small piece of tungsten, an element used in incandescent light bulbs. As the sample was tilted 62 times, the researchers were able to slowly assemble a 3-D model of 3,769 atoms in the tip of the tungsten sample.