On Feb. 14, 1990, Carl Sagan gave us an incredible perspective on our home planet that had never been seen before.
As NASA’s Voyager 1 spacecraft sped toward the edge of the Solar System, Sagan, a member of the mission’s imaging team, pleaded with officials to turn its camera around for one last look back at Earth.
The resulting image, with the Earth as a speck less than 0.12 pixels in size, became known as “the pale blue dot.”
“Everyone you love, everyone you know, everyone you ever heard of, every human being who ever was, lived out their lives … on a mote of dust suspended in a sunbeam,” Sagan later wrote.
Tom Scott has done a very good job of explaining one of the fundamental problems renewable energy poses for the electricity grid, and what is being done to try to fix it.
The situation, in short: as the world shifts its energy sources from conventional spinning generators (fuel oil, nuclear, coal, gas, etc.) to renewables (wind, solar, hydro), it becomes harder to keep the grid balanced, because renewable output cannot be dispatched on demand.
There are three main solutions:
The first is to store energy in giant batteries for later use where needed; but this is practically unfeasible at large scale. To put it in context: the largest battery of this type, in China, occupies the area of a football field and stores 36 MWh: enough to power 12,000 homes for just one hour.
The second is traditional pumped hydro: using excess energy at certain times of the day to pump water up into higher reservoirs or lakes, and generating from it later. This is already being done (in Spain, for example, on the island of El Hierro).
The third option is the most interesting: store energy in batteries distributed across homes, especially in electric cars, in electronic devices of all kinds and (soon) in medium-sized batteries that can be placed in any corner. In these networked systems the various devices (a car, or the photovoltaic panels people have on their roofs) can “sell” their surplus energy to the grid in real time. Even so, some conventional power plants would still be needed.
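As a sanity check on the figures quoted above for the Chinese battery: 36 MWh shared among 12,000 homes is 3 kWh per home, i.e. one hour at an average draw of 3 kW (the per-home load is my assumption, implied by the numbers in the paragraph above). A throwaway Fortran snippet:

```fortran
program battery_check
   implicit none
   real, parameter :: capacity_mwh = 36.0   ! battery energy capacity [MWh]
   real, parameter :: home_load_kw = 3.0    ! assumed average draw per home [kW]
   real :: homes_for_one_hour

   ! Convert MWh to kWh and divide by the hourly energy use of one home
   homes_for_one_hour = capacity_mwh * 1000.0 / home_load_kw
   print '(a,f8.0)', 'Homes powered for one hour: ', homes_for_one_hour
end program battery_check
```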
Every year the cost of power-generation projects changes as a function of technological change, supply prices and economies of scale.
NREL has recently published the 2016 version of its baseline of cost and performance for most technologies used for power generation today. It shows flat prices for conventional energy and an impressive decline in the price of renewables, especially solar PV, with capital costs down to 0.9 MM USD/MW for utility-scale PV.
I’m excited to think the day has come when renewable energy is economically competitive with traditional sources and no longer needs subsidies to displace old, polluting forms of energy production.
In recent years I have gone from being a solar researcher to a consultant and then to project manager of solar projects. I have had very little time to write on this blog or to advance my project to develop a program for the analysis of radiation data files.
Often work and personal life leave little room for anything else, and hobbies have to be postponed.
As project manager at juwi I was responsible for the development of the Tambo Real project, a 1.2 MWp plant located in Vicuña, Chile.
It was a great experience to be in charge of building what was then the second-largest photovoltaic park in the country, and I took away many lessons learned.
I would like to share with you some pictures of the construction process.
This video is quite old; it is a time-lapse of a day spent with a friend at the UTFSM solar laboratory, building a shader on the tracker for a pyranometer. The shader consisted of an inverted-L arm holding a disk that blocks the direct rays of the Sun; the arm was carried by the tracker arm that supports the pyrheliometer.
Fortran 2003 is a major extension of Fortran 90/95 that includes many useful features. One significant feature is access to command-line arguments: a program can take data from the execution command line and use those arguments as input information.
program cmdline
   implicit none
   character(len=*), parameter :: version = '1.0'
   character(len=32) :: arg
   integer :: i

   do i = 1, command_argument_count()
      call get_command_argument(i, arg)

      select case (arg)
      case ('-v', '--version')
         print '(2a)', 'cmdline version ', version
      case ('-h', '--help')
         call print_help()
      case default
         print '(2a)', 'Unrecognized command-line option: ', arg
         call print_help()
      end select
   end do

contains

   subroutine print_help()
      print '(a)', 'usage: cmdline [OPTIONS]'
      print '(a)', ''
      print '(a)', 'cmdline options:'
      print '(a)', '  -v, --version    print version information and exit'
      print '(a)', '  -h, --help       print usage information and exit'
   end subroutine print_help
end program cmdline
The ability to generate high-quality scientific graphics for data analysis, showing summarized content, manipulated data and calculations, is an art. For this it is critical to have powerful, easy-to-use tools; my favorite is Plot for Mac. It takes a little effort to learn at first, but the results can quickly be amazing.
In my last year of college I started working in solar energy research, processing large amounts of global, diffuse and direct solar radiation data. Processing this information was very difficult, considering that a year of one-minute measurements means more than 500,000 data points. For each point I had to calculate the position of the sun and run quality control on the measurements; it was impossible to do in Excel because of long processing times and freezes. That is when I began processing the data with Fortran: everything was much faster, nothing crashed, and I could work through large amounts of information quickly. The problem was that I could not generate graphics, and that was critical for me.
That is how I found DISLIN, a library for generating incredible graphics from Fortran, very easy to use and able to export to all common formats. In time I even used DISLIN to build graphical user interfaces (GUIs) for Fortran software.
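For anyone curious what DISLIN code looks like, here is a minimal sketch using DISLIN’s quickplot interface: as I recall it, `metafl` selects the output format and `qplot` draws axes and curve in a single call, but check the DISLIN manual before relying on the details.

```fortran
program dislin_demo
   use dislin                    ! DISLIN Fortran module
   implicit none
   integer, parameter :: n = 300
   real :: x(n), y(n)
   integer :: i

   ! Build a simple curve to plot
   do i = 1, n
      x(i) = 0.1 * real(i - 1)
      y(i) = sin(x(i))
   end do

   call metafl('PNG')            ! write the plot to a PNG file
   call qplot(x, y, n)           ! quickplot: initialization, axes and curve in one call
end program dislin_demo
```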
Below are images with more than 200,000 points each, rendered in a couple of seconds with Fortran and DISLIN.
The United States National Renewable Energy Laboratory (NREL) has done an amazing job developing the Solar Prospector, a tool for navigating data derived from satellite imagery at a resolution of 10×10 km. This mapping tool is designed to help developers site large-scale solar plants by providing easy access to solar resource datasets and other data relevant to utility-scale solar power projects.
Unfortunately, the data covers only the United States.
Rotating Shadowband Radiometers (RSR) are a cheap and precise way to measure the solar resource (global, diffuse and direct radiation) using a silicon photodiode radiometer and a rotating shadowband.
The system was developed by Dr. Edward Kern, an MIT professor whom I met personally a few years ago; in 2003 he founded Irradiance, Inc., the company that manufactures the Rotating Shadowband Radiometer (RSR2).
Rotating Shadowband Radiometer (RSR2). Image from Irradiance.
The Irradiance RSR2 head unit uses a rotating curved band and a single fast-response photodiode sensor to measure global and diffuse sunlight. Direct sunlight is calculated by Irradiance’s computer program running onboard the Campbell Scientific data logger. The system includes a head unit, motor controller, temperature/relative-humidity sensor, data logger, PV/battery power supply, cellular modem for remote data access, and a stable, lightweight tripod.
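The calculation behind that direct value is the standard closure relation DNI = (GHI − DHI) / cos(θz), where θz is the solar zenith angle. A sketch in Fortran (the function and argument names are my own, not Irradiance’s):

```fortran
real function direct_normal(ghi, dhi, zenith_deg)
   ! Closure relation: DNI = (GHI - DHI) / cos(solar zenith angle)
   implicit none
   real, intent(in) :: ghi          ! global horizontal irradiance [W/m2]
   real, intent(in) :: dhi          ! diffuse horizontal irradiance [W/m2]
   real, intent(in) :: zenith_deg   ! solar zenith angle [degrees]
   real, parameter  :: deg2rad = 3.14159265 / 180.0

   direct_normal = (ghi - dhi) / cos(zenith_deg * deg2rad)
end function direct_normal
```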
Solar radiation received at the surface of the earth on a clear-sky day is subject to variations due to changes in the extraterrestrial radiation and to two additional, more significant phenomena:
Atmospheric scattering by air molecules, water and dust
Atmospheric absorption by O3, H2O and CO2
Scattering of radiation as it passes through the atmosphere is caused by interaction of the radiation with air molecules, water as vapor and droplets, and dust. Scattered photons (mostly at short wavelengths) produce the diffuse sky radiation. The degree to which scattering occurs is a function of the number of particles through which the radiation must pass and of the size of those particles relative to the wavelength of the radiation. The path length of the radiation through air molecules is described by the air mass.
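Away from the horizon the air mass is roughly the secant of the solar zenith angle; the widely used Kasten and Young (1989) fit below also behaves well at large zenith angles. A sketch in Fortran (the function name is mine):

```fortran
real function air_mass(zenith_deg)
   ! Kasten & Young (1989) relative optical air mass:
   !   AM = 1 / ( cos(z) + 0.50572 * (96.07995 - z)**(-1.6364) ),  z in degrees
   implicit none
   real, intent(in) :: zenith_deg   ! solar zenith angle [degrees]
   real, parameter  :: deg2rad = 3.14159265 / 180.0

   air_mass = 1.0 / ( cos(zenith_deg * deg2rad) &
            + 0.50572 * (96.07995 - zenith_deg)**(-1.6364) )
end function air_mass
```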
Absorption of radiation in the atmosphere in the solar energy spectrum is due largely to ozone in the ultraviolet and to water vapor and carbon dioxide in bands in the infrared. There is almost complete absorption of short-wave radiation by ozone in the upper atmosphere at wavelengths below 290 nm, and water vapor absorbs strongly in bands in the infrared part of the solar spectrum, with strong absorption bands centered at 1000, 1400 and 1800 nm. Beyond 2500 nm, the transmission of the atmosphere is very low due to absorption by H2O and CO2.
The remaining unabsorbed and unscattered photons constitute the direct beam radiation. The total radiation flux on a horizontal surface in the presence of diffuse and beam radiation is called “global” radiation.
Figure: Normally incident solar spectrum at sea level on a clear day. The dotted curve shows the extraterrestrial spectrum.
Solar radiation is emitted from the sun in the form of electromagnetic waves; on entering the atmosphere it is filtered by the gases the atmosphere contains and, above all, by floating clouds, which bounce the light back and act as mirrors. When a cloud passes in front of the sun, global radiation decreases dramatically and diffuse radiation increases, the result of the cloud deviating the rays of light.
Accurate computation of solar position plays a fundamental role in solar energy applications, especially for concentrating systems. The required accuracy varies over a wide range depending on the application: flat-plate systems tolerate errors of a few degrees without significant losses, while high-concentration systems can require an accuracy on the order of 0.01°. More specific applications, such as the calibration of pyranometers (Reda and Andreas, 2004), require even greater accuracy.
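At the low-accuracy end of that range, simple closed-form approximations are enough; Cooper’s (1969) formula for the solar declination, for example, is accurate to well under a degree. A sketch in Fortran (the function name is mine):

```fortran
real function declination(dn)
   ! Cooper (1969): delta = 23.45 * sin( 360 * (284 + dn) / 365 )  [degrees]
   implicit none
   integer, intent(in) :: dn        ! day of the year (1..365)
   real, parameter :: deg2rad = 3.14159265 / 180.0

   declination = 23.45 * sin( deg2rad * 360.0 * (284 + dn) / 365.0 )
end function declination
```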
Most of the energy emitted by the Sun is in the form of electromagnetic radiation, with a spectrum determined by the temperature of its outer layer. The small part of this energy that reaches the Earth is our source of life and energy.
The radiation arriving on a plane normal to the sun’s rays at the outer edge of the atmosphere, before being filtered, is called extraterrestrial radiation and can be approximated by:
G_on = G_sc · (1 + 0.033 · cos(360 · dn / 365))
where G_sc = solar constant, 1367 W/m², dn = day number of the year (1 … 365), and the cosine argument is in degrees.
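The formula above translates directly into a few lines of Fortran (the function name is mine):

```fortran
real function extraterrestrial(dn)
   ! Normal-incidence extraterrestrial radiation G_on [W/m2] for day dn
   implicit none
   integer, intent(in) :: dn             ! day of the year (1..365)
   real, parameter :: g_sc    = 1367.0   ! solar constant [W/m2]
   real, parameter :: deg2rad = 3.14159265 / 180.0

   extraterrestrial = g_sc * (1.0 + 0.033 * cos(deg2rad * 360.0 * dn / 365.0))
end function extraterrestrial
```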
The radiation is then filtered by the gases present in the atmosphere, such as H2O, CO2, O3 and O2, and reflected by the clouds. Of the radiation that finally reaches land and ocean, part is absorbed and part reflected.
The absorbed radiation is transformed into heat and emitted back to space as infrared radiation.
Measuring the solar radiation that arrives at the Earth’s surface requires an instrument with a thermopile. If the instrument is designed to measure infrared wavelengths it is called a pyrgeometer; if it is designed for the visible spectrum it is called a pyranometer. A good thermopile must have a flat response over the whole spectrum it measures.
Most of the time, the most difficult step is the first one…
Every programming language has its own statements for expressing in code what you want your program to do. Compiling is the process of translating source code written in a given programming language into binary machine language, creating an executable program.
Developing a program involves a few main steps: defining the flowchart of information, writing the code, testing, and optimizing.
The compilation commands differ between the testing-and-debugging stage and the optimization stage: while testing functionality, speed is not an issue, but it matters to be able to run the debugger and easily find where the problems are; for the final build the priority shifts to optimization.
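For example, with gfortran (other compilers use different flags) the two stages might look like this:

```shell
# Debugging build: debug symbols, no optimization, runtime checks, all warnings
gfortran -g -O0 -fcheck=all -Wall cmdline.f90 -o cmdline

# Release build: let the compiler optimize aggressively
gfortran -O3 cmdline.f90 -o cmdline
```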
Git is an awesome free, open-source, distributed version-control system designed to handle everything from small to very large projects with speed and efficiency, no matter which language you are working in. It is very useful for multiple developers working on the same project.
It was initially designed and developed by Linus Torvalds for Linux kernel development.
There are several free hosting services for source code with Git; one of the most popular is GitHub.
Here is a quick reference guide for the most common Git commands.
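A minimal set of everyday commands, assuming a typical single-remote workflow:

```shell
# Start or copy a repository
git init                      # create a new repository in the current folder
git clone <url>               # copy an existing remote repository

# Everyday work
git status                    # see what has changed
git add <file>                # stage a change
git commit -m "message"       # record the staged changes
git log --oneline             # compact history

# Branches and remotes
git branch                    # list branches
git checkout -b <name>        # create and switch to a new branch
git merge <name>              # merge a branch into the current one
git pull                      # fetch and merge from the remote
git push                      # upload local commits to the remote
```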