
OSAM-1: Proving Satellite Servicing—Starting with Landsat 7


Source: NASA’s Small Steps, Giant Leaps Podcast

Artist’s concept of OSAM-1 docking with a satellite. Image credit: NASA

In Episode 91 of NASA’s Small Steps, Giant Leaps podcast, OSAM-1 Lead Systems Engineer Wendy Morgenstern discusses NASA’s On-orbit Servicing, Assembly, and Manufacturing 1 mission to robotically refuel Landsat 7—a satellite that wasn’t designed to be serviced.

Services such as tuning up or refueling a satellite have been impossible for most spacecraft, but the OSAM-1 mission is set to change that paradigm. OSAM-1 is a robotic spacecraft equipped with tools, technologies and techniques needed to extend satellites’ lifespans. In addition to extending the life of Landsat 7, OSAM-1 can give satellite operators new ways to manage their fleets more efficiently and get more value from their initial investment. NASA is transferring OSAM-1 technologies to commercial entities to jumpstart a new domestic servicing industry. The Space Technology Mission Directorate at NASA Headquarters and the Satellite Servicing Projects Division at NASA’s Goddard Space Flight Center manage OSAM-1.

===Transcript===

 

Wendy Morgenstern. Photo credit: NASA

Wendy Morgenstern: Within NASA we have been working on this kind of servicing idea for decades. If you think back to the iconic Hubble Servicing Mission, that was a start. We were first inventing tools for astronauts. But the potential could be seen over the years of, ‘Well, what if we did it without astronauts? What if we could do more and more robotically? What if we could do the whole servicing assembly and manufacturing and build more and more complex things in space?’

Deana Nunley (Host): Welcome to Small Steps, Giant Leaps, a NASA APPEL Knowledge Services podcast where we tap into project experiences to share best practices, lessons learned and novel ideas.

I’m Deana Nunley.

NASA’s On-orbit Servicing, Assembly, and Manufacturing 1 mission, or OSAM-1, will be the first to robotically refuel a satellite that wasn’t designed to be serviced. The mission will also demonstrate assembly and manufacturing technologies and capabilities.

Wendy Morgenstern is the Lead Systems Engineer for the OSAM-1 mission.

Wendy, thank you for joining us on the podcast.

Morgenstern: I’m happy to be here.

Host: Could you give us a quick overview of OSAM-1?

Morgenstern: Sure. So first I’m going to demystify some of the acronym gobbledygook, because NASA loves to speak in acronyms. I just want to orient everyone and break down the word OSAM. You’ll sometimes hear OSAM or you’ll hear ISAM; both of them have to do with the same priorities. It’s On-orbit Servicing, Assembly, and Manufacturing. Servicing, in very simple terms, is fixing a spacecraft on orbit. This can be repairing something or adding things to it, and I’ll talk about what our particular mission is doing in this area. Assembly is more like construction in space with pre-manufactured pieces. I think of that as taking pieces and assembling them in orbit, and we’ll talk about how our mission’s doing that. And then manufacturing, the ‘M’ in OSAM as I say it, is taking raw materials and turning them into infrastructure to be used in space. So I just wanted to give that general background.

You’ll also hear the term ISAM for In-orbit Servicing, Assembly, and Manufacturing, as opposed to on-orbit, but they’re really the same thing. So for OSAM-1, we are a pretty rich mission. We are launching around 2026, and we are planning to service Landsat 7, which is a legacy spacecraft. Once we launch, we go through autonomous rendezvous and docking with Landsat 7. Landsat 7 was never designed to be serviced, so this is a great landmark in figuring out how to track down, get a hold of, and work with legacy spacecraft. After we finish autonomous rendezvous and docking, we capture Landsat 7 robotically and put it into our berthing system. Now we’ve become one super spacecraft, or stack, with our servicing vehicle and Landsat 7 sitting on top of it. That gets us up to 7,000 pounds of spacecraft to throw around.

So then, having successfully captured Landsat 7, we use the robotic arms we brought with us and start taking it apart to do some servicing. I can describe those tasks a little more when we talk about the technologies we’re developing. But after we finish refueling and closing up Landsat 7, that finishes out the ‘S,’ the servicing part of our mission, and then we move on to the assembly and manufacturing.

We also carry a payload called SPIDER, the Space Infrastructure Dexterous Robot, just to spell out the acronym. SPIDER will do the ‘A’ and the ‘M’ in OSAM. We take to space a three-meter antenna broken into seven pieces. To paint a visual picture while we’re on audio, I think of it as the petals of a flower: there’s a central element and then six elements around it that turn it into a three-meter communication dish. We take those seven pieces with us to orbit and we assemble them robotically. And we do that a couple of times for repeatability, and I’ll talk about why that’s a good technology to demonstrate. That is our very, very brief ‘A’ in OSAM, assembly.

Then with manufacturing, we have a piece of SPIDER that we colloquially call MakerSat, another piece of the payload, but it is what you would call an additive manufacturing technology demonstration. My younger niece would call it a 3D printer. We take our 3D printer to space and we pultrude a beam out to 10 meters. And our experiment there is not only to demonstrate that we can manufacture that beam in space, but to measure its stiffness: to show that it is above our goals and, in fact, see how stiff we can make it. So that gives us a benchmark for really, literally, printing beams in space.

Host: So, talking more about what SPIDER will demonstrate, what can we expect?

Morgenstern: So, what you can expect from SPIDER is, like I said, the assembly and manufacturing. Going a little further down into the details, picture a robotic arm and then stacks of what I’ll call seven petals. It’s a pretty intricate task: we have to take a robot arm, release the stacks of petals that have survived launch (we have to tie them down very tight for launch), then move a boom into position and, with the robot arm, start picking up the petals, putting a central element on, and then robotically assembling all the way around. This requires a lot of very careful design and consideration in the early stages. For each of these steps, you have to think, ‘OK, an operator on the ground, who may be working with a three to six second communication delay, is the one running these tasks.’ So you have to think very carefully about where your cameras are going to be so the operator can see to guide the robot arm. You have to think very carefully about how you’re going to create attachment points. So, the end of a robot arm — just to teach you the jargon — we usually call some sort of end effector. Think of it as the fingers at the end of the robot arm. We can switch different tools or different fingers in and out on many of our robot arms.

SPIDER, in particular, has a dexterous end effector that’s made to pick up something that almost looks like a very simple Monopoly piece: think of a game piece with a vertical shaft and a wider ball at the top. That gives the robot arm’s end effector something common to pick up. And then we have to very carefully decide where we’re putting those on each of the pieces we’re going to be assembling, and how we’re going to be able to see them. So, think about if you’ve ever tried to assemble anything, except we have to pre-plot every move. How are we going to pick it up? How are we going to be able to see it as we go?

Host: That sounds really fascinating. So what’s the mission timetable?

Morgenstern: So, we will be launching in 2026. We will run our on-orbit series of experiments, as I call them, for six months to a year. At this point, we’re in the middle of the build. We’ve passed and closed out our Critical Design Review, and now it’s hardware all over the garage floor as we bring together the servicing payload and the SPIDER payload, as well as our spacecraft.

Host: So far, Wendy, what are the biggest engineering challenges?

Morgenstern: I love this question. One of the things you have to think about is that OSAM-1 is a technology demonstration mission, so we are supposed to have a lot of engineering challenges. And another really cool thing about this mission is that the people who are building it are the people who will be flying the experiment on orbit. We will envision, build, and test our technology, take it to space, operate it, tune it up, and try things a couple of different ways to get the most out of our mission. So let me talk about just a few of our biggest engineering challenges.

So, NASA history geek moment here: I feel like we’re in the Apollo 7 zone. Apollo 7 was the first crewed flight of the command module, flown in low-Earth orbit, and it tested out a ton of technologies we were going to need to land on the Moon. We’re doing the same sort of thing. Apollo 7 had a ton of firsts, and so do we, as a stepping stone into the future.

So, our first big challenge, as I’ve described it, is autonomous rendezvous and docking. This means we have to launch and match orbits with Landsat 7 in a very safe and autonomous way. There won’t be any pilot flying this vehicle, and there won’t even be a pilot trying to fly the vehicle from the ground, because human reflexes can’t overcome the six-second time delay we may have. So it’s all autonomous. That is a big difference. If you think of the challenges we see here on Earth with self-driving cars, those are some of the same challenges we’re taking on in space, but we’re trying to do it by bringing together two vehicles that are moving at about 17,000 miles per hour.
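
For readers curious where that speed comes from, a quick back-of-the-envelope check reproduces the figure. The sketch below assumes a roughly 705-kilometer circular orbit for Landsat 7; the constants and numbers are illustrative, not mission data.

```python
import math

# Back-of-the-envelope check of the ~17,000 mph figure quoted above,
# assuming a roughly 705 km circular orbit (illustrative values only).
MU_EARTH_KM3_S2 = 398600.4418   # Earth's gravitational parameter
R_EARTH_KM = 6378.137           # mean equatorial radius
ALTITUDE_KM = 705.0             # approximate Landsat 7 altitude

r = R_EARTH_KM + ALTITUDE_KM
v_km_s = math.sqrt(MU_EARTH_KM3_S2 / r)   # circular orbital speed
v_mph = v_km_s * 3600 / 1.609344          # convert km/s to mph

print(f"Orbital speed: {v_km_s:.2f} km/s, about {v_mph:,.0f} mph")
```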

We start out with our space vehicle about 30 kilometers below Landsat 7, and then we go through rendezvous in a series of very deliberate steps, closing on Landsat 7 by decreasing the range, the distance between the two spacecraft. At about a kilometer out, think of spotting a car on the far horizon: Landsat 7 is now kind of a dot. We find it using our lidar. If people are not familiar with lidar, that’s Light Detection and Ranging: we ping a laser out, and the return gives us both range and bearing, meaning the distance to that Landsat 7 dot on the horizon and its direction, the bearing off our bow, to use an analogy.
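
As a rough illustration of the range-and-bearing idea described here (not flight software), the sketch below converts a laser pulse's round-trip time into range and a return direction into bearing angles. The function names and sample numbers are made up for the example.

```python
import math

# Minimal sketch of how a lidar return yields range and bearing:
# range from the laser pulse's round-trip time, bearing from the
# return direction expressed in the sensor frame. Illustrative only.
C_M_S = 299_792_458.0  # speed of light

def range_from_time_of_flight(round_trip_s: float) -> float:
    """Range in meters from the round-trip travel time of the laser pulse."""
    return C_M_S * round_trip_s / 2.0

def bearing_from_unit_vector(ux: float, uy: float, uz: float) -> tuple[float, float]:
    """Azimuth and elevation (degrees) of a unit return direction in the sensor frame."""
    azimuth = math.degrees(math.atan2(uy, ux))
    elevation = math.degrees(math.asin(uz))
    return azimuth, elevation

# Example: a return about 6.67 microseconds after the pulse is roughly 1 km away.
print(range_from_time_of_flight(6.67e-6))            # ~1000 m
print(bearing_from_unit_vector(0.999, 0.035, 0.010))  # small off-boresight angles
```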

So, once we have our spot, we go through our next step of closing the distance. You’ll hear this talked about in the OSAM community in general as far-field work and near-field work; then we get into proximity operations when we’re close enough to the thing we want to rendezvous with, and then we go all the way up to capturing.

So, for us, as we walk through those stages I’m describing, we carry a wide variety of cameras to give us a good picture of the target as we close the distance. We are using cameras both in the visual range and in the infrared range, which is new. Infrared we’ve played with a lot on orbit, just seeing how things look in the infrared, but closing a control loop around it, using it to drive the car, is a significant new guidance, navigation and control technology. So there’s another big engineering challenge for us.

So, when we carry that variety of cameras, what we’re doing is what you’ll also hear called, in the community and the technology in general, machine vision. Just as a human looks at the scene while driving a car and reacts to what they’re seeing, we’re now teaching the entire spacecraft to be smart enough to take those camera images — and this is yet another layer of new technology for us — turn them into a picture it can recognize, and then respond appropriately to steer in safely and autonomously.

So, we do this with what we call supervised autonomy, because we’re an experiment. We have a series of waypoints where we go, ‘All right, go for the autonomy,’ and we’ll stop at around 90 meters out and 30 meters out. Then we take one last, final look. Imagine this: two vehicles going 17,000 miles per hour, hanging five meters apart. And then we say, ‘Go for the final approach autonomously,’ and we move in until Landsat 7 is within reach of our just-under-two-meter robot arm, and the robot arm reaches out and captures Landsat 7.
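
The supervised-autonomy pattern described above, closing autonomously and then holding at pre-agreed waypoints until the ground gives a 'go,' can be sketched in a few lines. The hold distances follow the interview; the structure, function names, and ground-approval hook are purely illustrative.

```python
# Minimal sketch of supervised autonomy: the vehicle closes the range
# autonomously but holds at pre-agreed waypoints until operators approve.
# Hold distances follow the interview (90 m, 30 m, 5 m); everything else
# here is an illustrative assumption, not mission software.
HOLD_POINTS_M = [90.0, 30.0, 5.0]

def approach(current_range_m: float, ground_go) -> str:
    for hold in HOLD_POINTS_M:
        if current_range_m > hold:
            # Autonomous GN&C closes the range down to the next hold point.
            current_range_m = hold
        # Station-keep here until operators review telemetry and approve.
        if not ground_go(hold):
            return f"holding at {hold} m, awaiting ground approval"
    return "go for final approach and capture"

# Example: ground approves every hold point.
print(approach(1000.0, lambda hold: True))
```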

So now we get into the incredible set of tools that we take to space, tools or items that we can put on the end of our robot arm. The first one we pick up and use is a gripper tool, which does exactly what you think: the name paints the visual picture as it grips onto Landsat 7. Once the machine vision has taken a picture and figured out where the robot arm needs to reach and where it can successfully grab onto Landsat 7, we close down the gripper tool and capture it. This is a very tricky operation with a lot of fun, intricate technical details. You’ve got two vehicles in space in this soup of a space-charging environment. They’re both at different electrical potentials; we could shock each other.

So, the gripper tool has been designed and redesigned several times to make sure that we first barely touch and bring the two electrical potentials into, I’ll call it, alignment before we actually physically grab on. Then we close and grab on. And now, if you can picture it, we have that huge Landsat 7 observatory on the end of a robot arm.

So, our next step is to turn it over to our ground robotic operators. Now we move off autonomy and into pre-planned, human-in-the-loop operations. We go through a series of moves to pull Landsat 7 down onto our berthing posts. If you go on YouTube and search for OSAM-1 you’ll see some great videos of this, but I’ll paint the picture: we’ve got three posts that sit up above our servicing deck, and the robot arm guides Landsat 7 into those until we grip onto the Marman ring. The Marman ring is, typically, the thing you see on the bottom of every satellite that clamps onto a rocket. That’s what we’re using to attach Landsat 7 to our space vehicle.

Once we’ve got it there, we start picking up all the other tools on the deck. We have a great number of different tools, and the first thing we have is an autonomous tool drive. Think of that as a drill you might use here on Earth, but with cameras attached all over it, including lenses that zoom in and out. So now I have that on the end of my robot arm, and I’m going to pick up tools to do a series of different tasks.

So, my very first task, and this is another first in space, is cutting open the thermal blankets on Landsat 7, which has never been done before, and then robotically opening the fill and drain valves. Now, I have fueled five different spacecraft in my career at NASA, and I am used to humans going in and being able to turn the valves, hook up hoses, triple-check things, fuss with the seals, and make sure we’re all tight. We’re doing all of that robotically, all of it pre-planned. So that is another huge gamechanger.

So, we have a tool we’ve invented called a quick disconnect tool. Once we manage to get the traditional fill and drain valves off, we’re going to put a cap in place called the quick disconnect, and that lets us bring up our propellant transfer system. Here’s another first in space. Paint this picture in your head: you’ve tried to put gas in your car, the hose is fighting you or twisting in your hand, and you’re trying to get the nozzle into the tank. We have to do all of that robotically, with a hose we extend without getting it bound up or tripped up.

So, our partners at Kennedy have really been leading our propellant transfer system, and this has been an amazing piece of technology to come together. We weld our piece into the flight vehicle, and that lets us hook up a hose to Landsat 7 and refuel it, because it’s pretty low on fuel; it will have under about 10 kilograms in the tank by the time we get there. Once we transfer over a whole bunch of fresh hydrazine, Landsat 7 is refueled and ready to go for a number of more years.

But we’re still not done, because we also have to put the gas cap back on. That quick disconnect tool has been carefully designed so that once we detach the hose, it self-seals, but I’ve still got the fuel valves exposed to the freezing temperatures of space. So we have to close back out all those thermal blankets we cut open. We have yet another tool, a thermal closeout, that looks pretty much like a cap. Think of a square box hat we have to robotically put back over all of those valves.

So now Landsat 7 is sealed up nice and tidy, and we have to reverse everything we did. We’re going to boost it first to a new orbit so it doesn’t even have to use some of its new fuel. That’s another technology we want to demonstrate: that we can relocate a vehicle in space. And then we undo everything we did, let it go, and wave goodbye as it goes on its way, now refueled and ready to do more things.

So that is a lot of technology right there. And that’s just in the ‘S’ part of OSAM.

Host: Wow. So that’s a lot about servicing. But tell us more about assembly and manufacturing.

Morgenstern: So, for assembly, the particular piece I talked about is a communications dish. We have a lot of experience on the ground assembling communication dishes, assembling large dishes, assembling large things. But what we’re constrained by on the ground is that we’ve got to fold them into a launch vehicle fairing. Here, we’re not constrained by that. We’re, literally, taking the pieces with us and assembling them in space, and that is a huge milestone for a lot of future technologies. So we have a lot of experience building a single dish on the ground, but now we’ve got to pre-make the pieces so that when we click it all back together, it meets the curvature requirements to do good radio frequency communication.

So, once we get it all assembled, we’re going to do yet another new thing: we are going to characterize how good the radio frequency performance is. That’s another stretch goal for us: after we do the basics of assembling a large item in space, we are going to communicate with the space network and, literally, do what we call slice cuts, moving our comm link back and forth across the connection to the TDRSS space network satellites and seeing what kind of signal we get out of it. That gives us a performance benchmark, beyond just being able to do the assembly in space. So that’s one of the stretch goals we’re going for.

Then we’re going to disassemble it, and then we plan to reassemble it a couple of times, because we want to practice while we’re up there. And we want to see repeatability, because that’s another major thing: if we make things on the ground and get them together in space, could we get them together a couple of different times? Could we do it repeatably, over and over? So that’s one of our goals.

And then I’ll talk last about manufacturing. The additive manufacturing MakerSat portion of the payload will pultrude, which is your vocabulary word for the day. We have experimented extensively; our partners at Tethers Unlimited are actually manufacturing MakerSat, and they tried a whole bunch of different dies, which, if you’re a 3D-printing geek, is the shape that you run the melted material through. So we’re pultruding a U-shaped beam, a nice, strong mechanical shape, and we have a visual target on the end of it.

So, if you’ve ever done any basic stiffness experiment with beams, and I’m having flashbacks to my first physics classes, as we print it, starting from nothing and going at least all the way out to 10 meters, we’ll keep stopping along the way. And we’ll, literally, be watching the beam deflect in time and space by following the target, and doing the math on how stiff the beam is. We’ll be able to correlate that with the data internal to the 3D printer on MakerSat, how far we think the beam has gotten and how stiff it should be. So there are two things here: we’re coming up with a measurement technique at the same time we’re printing a beam in space.
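
For a sense of the 'do the math' step, the sketch below uses the textbook cantilever relation, tip deflection = F * L^3 / (3 * E * I), solved for bending stiffness. The mission infers deflection by tracking the visual target on the beam tip; the load case and numbers here are illustrative assumptions, not MakerSat data.

```python
# Illustrative sketch of inferring beam stiffness from tip deflection.
# Uses the textbook cantilever relation solved for the bending stiffness E*I.
# The load case and values are assumptions for the example, not mission data.
def bending_stiffness(tip_load_N: float, length_m: float, tip_deflection_m: float) -> float:
    """Effective bending stiffness E*I (N*m^2) of a cantilevered beam with a tip load."""
    return tip_load_N * length_m**3 / (3.0 * tip_deflection_m)

# Example: a 0.1 N tip load on a 10 m beam deflecting 5 mm gives E*I of about 6.7e3 N*m^2.
print(bending_stiffness(0.1, 10.0, 0.005))
```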

Host: What do you see as the potential impact of OSAM-1?

Morgenstern: Oh, it’s going to be legion. So, within NASA we have been working on this kind of servicing idea for decades. If you think back to the iconic Hubble Servicing Mission, that was a start. We were first inventing tools for astronauts. But the potential could be seen over the years of, ‘Well, what if we did it without astronauts? What if we could do more and more robotically? What if we could do the whole servicing assembly and manufacturing and build more and more complex things in space?’

So, if you want to go back in history, because I’m a bit of a history geek as well as an engineer, Goddard Space Flight Center, which is where OSAM-1 is based out of, first started doing this kind of work with the Hubble servicing missions where we invented the tools, technologies, even the ability to upgrade and add to the Hubble Space Telescope.

So, the division I work with at Goddard has been developing these technologies since the Hubble Servicing Mission all those decades ago. We’re actually NExIS, NASA’s Exploration & In-space Services Division. And I love that because, when you ask me about the potential for OSAM-1, we’re actually, literally, at a crossroads, which is the alternate meaning of NExIS.

So for OSAM-1, with all of those technologies I’ve described, you’re moving into more autonomous rendezvous and docking technologies. The machine vision that I described is a significant leap forward in guidance, navigation and control technology. Right now, when humans dock something with the International Space Station, it’s a pilot supervising something in the visual range. But we’ll be going all the way to ‘the computer sees, the computer decides, the computer does,’ and it does it in a couple of different frequency ranges.

Interestingly, we also had to invent Pixar-like animation technology. In order to make our machine vision algorithms work and design them here on the ground, we had to take a side step into building animation as realistic as Disney’s Pixar of what Landsat 7 is going to look like, and then create simulations of our cameras seeing Landsat 7. As I mentioned with the history, that’s been a fun thing, going back and working very closely with our Landsat 7 partners to look at all the pictures and images we could ever find to build a virtual Landsat 7 and build machine vision algorithms against it. So that’s going to be a gamechanger.

I’ve talked about the tools that we’re using, all of which will be gamechangers. We’re looking at how the gripper tool could be used in many, many places, so that is something we’re working to commercialize. That’s one of the big passions at NASA.

I didn’t mention this one yet, but we actually have a cooperative servicing valve. As I said, Landsat 7 was built over 20-some years ago using traditional fill and drain valves for fueling. Well, what if we could change the industry to use cooperative servicing valves, in other words, something that you could easily connect to and disconnect from? That would make it much easier to take something like a next-generation OSAM-1 to space and refuel things. So we are actually flying a cooperative servicing valve on OSAM-1. That’ll be how we fuel our own spacecraft, and that gets it ready to be used on all future missions.

The other thing I can think of that’s game-changing: My Science Mission Directorate folks are already looking at, if we can assemble things in space, we’re out of the limitation of the fairing. So you can now think about, I could assemble huge telescope dishes, huge communication dishes. I could assemble larger occulters for telescopes in order to, say, study an extremely brightly lit object, but maybe study the corona around it by blocking out the middle. These are all ambitions NASA’s had for a while, and if we can build them in space, we won’t be as limited by the technology required to fold stuff up and do the origami to fit inside a rocket fairing.

Host: And you talked about the Hubble Servicing Mission. Are there lessons learned and experiences from past missions such as Hubble that have helped you and the OSAM-1 team develop this mission?

Morgenstern: Absolutely. Like I said, Goddard’s first experiences definitely started with Hubble servicing, which taught us a ton about being able to see from the ground. Let me paint another picture in your head from iconic footage. We’ve all seen the footage of a Saturn V rocket launching the Apollo missions. If you’ve ever noticed, on the side of that rocket is a black-and-white checkerboard pattern. That’s used to help visualize, as you’ve got a camera on something and the rocket is moving by, where exactly on the rocket you are looking. It gives you a benchmark in the frame.

So, one of the things we spent a lot of time on is thinking about how we can see precisely what’s happening. We have definitely used Hubble servicing experience. We have a series of payloads that have gone to the International Space Station for the Robotic Refueling Missions, where we looked at what’s the best kind of tool; we’ve had three series of those where we tried tool mark one, two, and three. We tried things like inventing a cryogenic transfer system, which is a similar propellant transfer system. We definitely learned tons about how to put together visual patterns, properly called fiducials, so that machine vision or a camera can see, ‘Hey, the tool’s oriented this way,’ or, ‘The body we’re trying to chase down and rendezvous and dock with is oriented a certain way in space.’
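
Fiducial-based pose estimation of this kind is a standard machine-vision problem. As an illustration only, with made-up corner locations, pixel detections, and camera parameters rather than OSAM-1 data, OpenCV's solvePnP can recover a target's orientation and position from a known fiducial pattern.

```python
import numpy as np
import cv2  # OpenCV

# Minimal sketch of recovering pose from a fiducial pattern. The fiducial
# corner layout, pixel detections, and camera model below are illustrative
# values invented for this example, not OSAM-1 flight data.
object_points = np.array([   # fiducial corners in the target's body frame (meters)
    [0.0, 0.0, 0.0],
    [0.1, 0.0, 0.0],
    [0.1, 0.1, 0.0],
    [0.0, 0.1, 0.0],
], dtype=np.float64)

image_points = np.array([    # where those corners were detected in the image (pixels)
    [320.0, 240.0],
    [400.0, 242.0],
    [398.0, 322.0],
    [318.0, 320.0],
], dtype=np.float64)

camera_matrix = np.array([   # simple pinhole model: focal length and principal point
    [800.0, 0.0, 320.0],
    [0.0, 800.0, 240.0],
    [0.0, 0.0, 1.0],
])
dist_coeffs = np.zeros(5)    # assume lens distortion already corrected

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, camera_matrix, dist_coeffs)
if ok:
    print("Rotation (Rodrigues vector):", rvec.ravel())
    print("Translation (m):", tvec.ravel())
```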

Host: Are there lessons that you and the team have learned in your own OSAM-1 experience that you could share with other NASA technical workers?

Morgenstern: Oh, let me give you my top two or three. Actually, one of my autonomous rendezvous and docking experts always sums up all of the lessons learned that have gotten us to this point this way: ‘To succeed, sometimes you have to be bad at something for a while.’

So, all of the machine vision work we’re doing now has definitely built on what came before. We have a Raven payload on the space station where we’ve put up cameras and just asked, ‘What do things look like as they come toward the International Space Station?’ That’s given us in-flight data from certain types of cameras to inform how to do machine vision. We’ve redesigned and redesigned and redesigned our machine vision algorithms, trying to figure out the best way for a computer to autonomously detect, ‘Hey, what’s the pose? What’s the orientation of this vehicle we’re trying to rendezvous with or do work on?’

So that’s definitely one of the top ones. Working with the Space Technology Mission Directorate at NASA Headquarters, they are all in on this: sometimes we have to be bad at things a couple of times before we get really good at them. Developing that is the whole reason STMD technology missions like ours exist.

And the other thing I would say is: when you’re trying new things, be prepared to be surprised, and be humble. No matter how good a model you build, you can always be surprised, because your understanding of the problem as you build the model can be limited. So one of the most powerful habits I rely on working on something like OSAM tech development is to think you just might be wrong, be humble, and question things. Which leads to the third lesson, applicable to almost everything I’ve ever worked on at NASA: you want to build the highest-fidelity ground test you can when you’re trying something totally new. And we’re trying a lot of totally new things. We have a whole Robot Operations Center with many complex test beds where we try all of these different technologies over and over again. So I think those are my top three.

Host: Those are super helpful. Thank you so much for sharing those. How do you think the technologies being developed for OSAM-1 will benefit commercial industry?

Morgenstern: So, at this point, our whole industry, both commercial and government, is really pulling together around OSAM, or ISAM by either name, trying to put together standards. Things like, ‘Hey, if everyone could use the cooperative servicing valve that we’ve been developing here within the NExIS Division and for OSAM, if everyone could adopt those standards, if everyone could agree on standards, we can commercialize that.’

We sponsor an annual OSAM-1 technology transfer workshop where we literally try to invite anyone interested from other government agencies and anyone among our commercial partners and say, ‘Hey, what can we license? What can we transfer to you? What can you take out of the catalog where we’ve done the work, take it over to the private sector, and make sure everyone’s got access to all of these OSAM technologies?’

We’ve definitely licensed out the cooperative servicing valve, the gripper tool, and even the client berthing system, which is the proper name for how, once you’ve got hold of the vehicle you’re trying to service, you clamp it down onto a steady work platform. So all of those have already been commercialized.

Host: And you can learn more about the OSAM-1 mission and topics discussed during our conversation at APPEL.NASA.gov/podcast. Many thanks to Wendy for joining us on the podcast. Her bio and a transcript of today’s show are also available on our website.

We’d love to hear your suggestions for future guests or topics on the podcast. Please share your ideas with us on Twitter at NASA APPEL – that’s app-el– and use the hashtag Small Steps Giant Leaps.

As always, thanks for listening.

Visit NASA’s OSAM-1 website for more.
