I am excited to announce that I am moving to Savannah in November to join the faculty at the University of Georgia Department of Marine Science. I will be recruiting 1 graduate student and 1 postdoctoral researcher immediately, so please contact me if you are interested in joining the newly formed ZERO-C Lab at Skidaway Institute of Oceanography. Thanks to my brother, Scott, for designing the lab logo (greeri.com).
We recently published a paper describing larval fish aggregating inside of a gelatinous matrix. Many fishes spawn eggs encased in a gelatinous matrix, but in the lab, the gelatinous material dissolves within 24 hours. These images show that the gelatinous matrix can persist much longer under natural conditions and may have important implications for larval transport and interactions with predators. Read the paper here https://www.int-res.com/abstracts/meps/v614/p209-214/
We have a new paper out in the most recent issue of Oceanography highlighting the big scientific questions addressed by our interdisciplinary research consortium studying the river-dominated shelf ecosystem of the northern Gulf of Mexico.
Read the “Regular Issue Feature” here: https://tos.org/oceanography/issue/volume-31-issue-03
I spent the past week at the 42nd Annual Larval Fish Conference in Victoria, BC. Researchers presented a remarkable variety of approaches to improving our understanding of larval fish ecology. This is the principal advantage of these small conferences: many of us have detailed background knowledge of the most important problems, but we usually lack expertise in applying particular techniques and approaches to these problems across different species groups and ecosystems. The presentations I attended, and the conversations afterwards, certainly led me to think differently about the approaches we take in both the lab and the field.
The conference got off to a great start with the opening plenary session by Janet Duffy-Anderson – one of the leaders at NOAA in their application of environmental and larval fish data to ecosystem-level questions about fisheries productivity. She was a perfect fit for opening speaker because everyone got a glimpse of all the different factors influencing the life history of fishes, and how these data can be applied to real-world problems (e.g., fisheries management, which encompasses a multi-billion-dollar global industry). She gave numerous examples of how important environmental data are in understanding shifts in fish abundances and predicted that these kinds of data will be increasingly important in fisheries management.
One of the most fascinating talks I saw was by a sensory biologist studying anchovy eye morphology. These fish can actually detect polarized light, and at certain angles this polarization sensitivity allows them to double their sighting distance (presumably to better locate prey). The researcher was able to document a change in their behavior under polarized vs. non-polarized light conditions. The fish without polarized light tended to filter feed, while the ones under polarized light would target and strike at prey. Specifically, they tended to strike at a 45-degree pitch angle, which is optimal for utilizing their polarized light sensitivity. Interestingly, the anchovy larvae we have imaged in the northern Gulf of Mexico have a consistent pitch angle of ~45 degrees. I think there is a lot of potential for using imaging to quantify behaviors in the field and coupling those observations with lab experiments that use sensory biology to explain the behaviors. The details of the eyes, and the differences among species and families, may have a big influence on the behaviors of visually oriented predators such as larval fishes.
There were also quite a few experimental talks examining the feeding rates and predator-prey dynamics of larvae using imaging. One even suggested that, because early-stage larvae are so clumsy at catching copepods, many larvae should be starving. This goes against evidence from net studies showing that most larvae are feeding successfully (they have full stomachs). However, at smaller sizes there could be large portions of the population that are starving, slow, and ultimately succumb to predation. This raises the question – how do any larvae survive at all? One of the unanswered questions that many research groups are addressing is what conditions lead to successful first feeding (when the larvae transition from yolk-sac to exogenous feeding) and fast growth. It could be environmental conditions (temperature, turbulence, etc.), the quality of their food, or a combination of factors.
After the conference I got a chance to go on a whale watch. We must have seen 15-20 orcas. Good thing I brought my camera!
Our new paper was just published in ICES Journal of Marine Science. We showed that a particular scyphomedusa species, Pelagia noctiluca (also known as “the mauve stinger”), was abundant in the northern Gulf of Mexico during two summer seasons (2011 and 2016). This species is normally associated with the Mediterranean Sea, where it can form intense blooms that sting swimmers. In the Gulf of Mexico, however, it lurks below fresher surface waters, so swimmers will not encounter them unless they habitually dive below 10 m. In addition to showing some of the first high-resolution distributions of these organisms, we demonstrated that several other zooplankton groups are less abundant in the vicinity of these medusae, which suggests they have a fine-scale, top-down impact on their prey. Larval and juvenile fishes also tended to aggregate underneath the bells of the Pelagia medusae, but only during the daytime.
I recently finished reading “Enlightenment Now: The Case for Reason, Science, Humanism, and Progress” where Steven Pinker argues that a broad range of measurements of human well-being indicate that humanity is better off now than at any time in history. I found the case to be compelling since Pinker draws his conclusions from large datasets that are often plotted as percentages or otherwise corrected for the huge population increase over time. Now more than ever, people are, on average, living longer, safer, and happier lives. Obviously, there is much more to be done to improve the human condition, but we have made a lot of progress, and Pinker claims this progress can be traced back to the widespread embrace of Enlightenment values.
The first part (of 3) defines "the Enlightenment" and examines how it changed humanity’s modes of thinking from beliefs steeped in superstition and dogma to a prevailing belief that, through careful empirical observation, the world is understandable. The most engaging part of this section is where Pinker introduces some fundamental concepts that the Enlightenment thinkers did not know about: entropy, evolution, and information. All organisms expend energy to slow the relentless push of entropy (the tendency toward disorder, seen in the gradual breakdown of bodies or cells). In the same vein, societies generate rules or cultural norms that create order (reduce entropy). Evolution has endowed humans with brains optimized to survive in bands of hunter-gatherers – not large, complex societies. In this respect, all humans are flawed, but we establish institutions and laws to create functional societies that combat a natural tendency towards entropy. Information, the third concept, leads to knowledge. Pinker writes, “Energy channeled by knowledge is the elixir with which we stave off entropy, and advances in energy capture are advances in human destiny.” With improving knowledge of food production, human societies were able to convert more of the sun’s energy into food with less manual labor. These three concepts (entropy, evolution, and information) come up throughout the book and shape a lot of our thinking about the world today.
In the second part of the book, Pinker takes a deep dive into the data, showing that human well-being has increased in almost every way one can imagine. The odd thing is that very few people seem to understand or recognize this fact. He argues that one major reason for this lack of recognition is that entropy happens quickly (on the time scale of a news cycle), while societal improvements (order) happen slowly and are therefore hard to communicate to the masses. Our brains have an innate availability bias (i.e., if examples quickly come to mind, those examples, whether indicative of general trends or not, dramatically shape your opinion on any given subject), which drives a general opinion that the world is spinning out of control or is much worse than it used to be. We hear about more shootings on TV and other news outlets, but the broad trend in societal violence is certainly decreasing in most of the world. There are, of course, exceptions, but Pinker is talking about statistical trends in human well-being, so not every country’s specific circumstances can be addressed in the data he examines.
Part 3 serves as a defense of the Enlightenment values that have led to the many improvements demonstrated in Part 2. Pinker rightly claims that these values have some unexpected enemies from both the extreme right and left sides of the political spectrum. This was the most engaging part of the book, and I found myself repeatedly highlighting and starring chunks of text. There is a lot to ponder in this book, but the main take home message, in my view, is that there are many problems in the world, but these problems have robust solutions if we stick to the Enlightenment values (their success is demonstrated by historical trends of increasing human well-being) and are not tempted by our natural tendencies towards superstition, dogma, and authoritarianism. These tendencies can provide “easy answers” but usually have disastrous consequences. Regardless of your political opinions, it would be worth your time to critically examine some of Pinker’s arguments. Like I said, I agree with his message, but I am also open to an honest discussion and want to hear the best arguments against his thesis.
The reviews I have read seem mixed, but most of the critical ones make the same error: they use anecdotes to argue that Pinker’s conclusions are wrong. This is generally a poor way to counter an argument built from statistical trends; an anecdote is an effective rebuttal only against something like a scientific theory or law. For example, to disprove evolution, or at least dramatically change its status, all you would need to do is find a fossil mammal in a Cambrian rock (among many other possibilities that have not been discovered). Also, I do not agree with some reviewers’ sentiment that Pinker is presenting an overly rosy view of our modern world. He specifically mentions that there are numerous threats which can lead to societal breakdown (entropy), but we do have many long-established institutions to resist such disorder. As I think more about the last part of the book, I hope to write more about these topics.
Last week I attended the Gulf of Mexico Oil Spill and Ecosystem Science Conference (AKA the GoMRI meeting) in New Orleans, along with many colleagues from our consortium (CONCORDE). The meeting is always a good experience that brings together scientists from many different areas of expertise studying the Gulf of Mexico. I had a poster and was co-author on a few talks (see @larvalfishlab on Twitter for some examples), but the highlight of the meeting, at least for me, was getting to share a “Shiny app” that I have been working on over the past few weeks. “Shiny” is a powerful R package that allows you to create customized, interactive plots, and users don’t need any programming experience to use the apps. Many websites use these apps to create interactive graphics (see showmeshiny.com for examples). My app takes data from the imaging system we use (the ISIIS) and allows you to plot several different oceanographic variables (you choose the color and contour data) and then overlay the fine-scale abundances of different zooplankton groups. You can also select transects, and the portions of each transect, that you want to view. I am hoping to add more data to the app and eventually make it available to scientists within the consortium (all data will ultimately be publicly available).
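To give a sense of how little code an app like this requires, here is a minimal sketch of an interactive transect viewer in Shiny. This is not the actual app – the data frame `isiis` and its column and taxon names are hypothetical stand-ins – just an illustration of the reactive input/plot pattern Shiny provides:

```r
library(shiny)
library(ggplot2)

# Assume `isiis` is a data frame with columns: transect, distance, depth,
# temperature, salinity, and per-group zooplankton counts (names hypothetical).

ui <- fluidPage(
  selectInput("transect", "Transect:", choices = unique(isiis$transect)),
  selectInput("colorvar", "Color variable:", c("temperature", "salinity")),
  selectInput("taxon", "Zooplankton group:", c("medusae", "copepods")),
  plotOutput("section")
)

server <- function(input, output) {
  output$section <- renderPlot({
    d <- isiis[isiis$transect == input$transect, ]   # subset to chosen transect
    ggplot(d, aes(distance, depth)) +
      geom_tile(aes(fill = .data[[input$colorvar]])) +            # hydrography background
      geom_point(aes(size = .data[[input$taxon]]), alpha = 0.5) + # fine-scale abundances
      scale_y_reverse()                                # depth increases downward
  })
}

shinyApp(ui, server)
```

Every time a user changes a dropdown, `renderPlot` re-executes and the section plot updates – no further plumbing needed.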
From the biological side of things, many talks focused on the impacts of the oil spill on various plants and animals in the marine ecosystem. Some species show changes in abundance after the spill, but the lack of background data from before the oil spill is a major issue when making statements about the impacts of any event. The lack of data also prevents us from parsing out how much change is due to natural variability in species abundance vs. changes due to anthropogenic impacts. Many of these presentations highlighted the need for continuous ecosystem monitoring if we are to better understand the interactions between humans and the functioning of the marine ecosystem.
There were also many presentations related to marine snow. For those who do not know, marine snow consists of a mixture of biological material (i.e., dead phytoplankton, fecal pellets, and transparent exopolymers that provide “stickiness” to aggregate the particles), and it is ubiquitous in the marine environment. You can even see it when you are diving or snorkeling if you focus on the tiniest things close to your mask. Marine snow is really interesting in the context of the oil spill because these particles can entrain oil and export it to depth. Understanding how marine snow forms, its composition, and potential to interact with oil and dispersants, is key to understanding how oil can propagate throughout the marine environment. Under normal conditions, marine snow is an important vector for moving carbon out of the euphotic zone to deeper waters (i.e., the ocean biological pump).
The meeting has caused me to think more about how imaging plankton and marine snow can lead to some new insights. In the marine snow area, we can measure properties of the particles in situ that are difficult or impossible to measure in the lab. For zooplankton, there may be some species that are indicators of ecosystem stress, such as gelatinous plankton, that we simply have not been able to detect using traditional plankton sampling gear. With CONCORDE, we have a unique opportunity to address some of these questions, and I am looking forward to digging into the data more over the coming months.
I just finished reading The Extended Phenotype by Richard Dawkins, which admittedly was one of the only books of his I had not read. Part of this delay was due to its similarity in content to his most famous book The Selfish Gene, which I read several years ago. The Extended Phenotype was written with scientists as the intended audience, and it takes some of the ideas in The Selfish Gene to their logical conclusion. The central thesis of the book is that genes generate phenotypes that are not isolated to influencing only the traits within the organism itself. Genes code for all kinds of behaviors, sometimes appearing altruistic or wasteful, that help complex organisms increase the probability of propagating their genes to the next generation. Some examples that come up are bowerbirds that build nests as a mere fitness advertisement (the nests are not actually used for egg rearing), and beavers’ dam-building behavior, which even demonstrates some cooperation among individuals. The point is that the phenotype “extends” out to influence how the organism interacts with the environment and other organisms. The level at which we choose to observe a phenotype (e.g., during development, adulthood, or behavioral interactions) is an arbitrary one, and Dawkins argues that we need to view evolution as genes acting on multiple phenotypic levels. The line is drawn, however, at behaviors or impacts of behavior (e.g., the shape of footprints) that have no value for propagating genes to the next generation.
Particularly powerful examples involve animals whose life cycles depend on other species. In these cases, genes in a parasite, for example, are influencing the behavior of the host, so we can consider the genes as temporarily hijacking the machinery of another species to influence its behavior and carry the parasite’s genes to the next generation. These kinds of complex interactions are only explicable with a “gene-centered” view of evolution. When the organism is considered the unit of selection, many behaviors and life cycle strategies can appear counterproductive, and Dawkins makes a pretty convincing case for the power of a “gene-centered” view to explain a variety of phenomena in nature.
The variety of interesting terrestrial examples throughout the book reveals a problem (in my view): there are few marine examples because we simply do not understand marine systems in the same amount of detail. Many of the ideas presented in the book could apply to interactions among different zooplankton groups (e.g., fish larvae and juveniles aggregating near larger gelatinous plankton), many of which have probably not even been described. Are there genes within the early life stages of fishes or other zooplankton that can hijack the machinery of larger animals, allowing them to improve their probability of survival? Or could larvae manipulate a predator’s sensory systems to avoid being eaten? These ideas would be extremely difficult to test, but with improving (and cheaper) sequencing techniques, it is only a matter of time before we discover the genetic components of behavior, and perhaps discover genes that code for traits to locate and utilize other species.
My favorite chapters were the ones about arms races, which I think are particularly applicable to plankton communities, along with the last several chapters. There are so many diverse strategies for planktonic existence that there must be intense arms races that are difficult to observe (see Smetacek 2001 Nature paper). The last chapters contained the most thought-provoking material. Overall, the book was a rewarding read, and I will probably be re-reading some of the chapters and using some of these examples for course material, should I teach ecology or evolution in the future.
This article is a little old, but I thought it was excellent because it hits on a common theme I am seeing in our polarized society: an inability to argue effectively without getting emotional and demonizing the opponent. In this NYT op-ed, Adam Grant essentially says we are losing the skill of arguing effectively by allowing conflict to descend into anger. He starts by talking about how many collaborations appear to be purely cooperative, where each party involved contributes something in a harmonious way. In fact, many collaborations are characterized by extreme disagreements, where ideas are mercilessly torn apart and reconstructed by the participants, often using a combination of the initial approaches. The key is that these arguments during any creative process must be thoughtful – each party must not take things to a personal level. If things are framed as a debate rather than a fight, participants are less likely to take criticisms or disagreements personally.
I may be biased because I like to argue (I mean "debate"), but I also find many of the most rewarding conversations begin with a disagreement. Scientists are constantly bickering about how data are interpreted, and there are often no concrete answers about the best approaches to a problem or interpretations of data. However, these conversations do open everyone up to their own biases or potential holes in their interpretations, which generally leads to better decision-making further down the road.
I admire people who can lay out a rational argument on an emotionally charged subject in a calm and collected manner. Not only that, but the most effective debaters take the time to understand the ins and outs of the other person’s position. Arguing in a way that does not create a straw man of the opponent, but genuinely attempts to summarize the differences between the respective positions, can fortify your own position: you come to understand how the other person thinks and what holes they are most likely to poke in your argument.
Above all, the article points out that debate is messy but necessary for creative solutions to complex problems. I think we can all take a few lessons from this piece: remember not to take criticism personally, and try to thoughtfully critique, rather than demonize, the people we disagree with.
Katrina Aleksa successfully defended her PhD dissertation on Wednesday, making her the newest PhD in the USM Division of Marine Science. Her project examined the ecology and behavior of leatherback sea turtles in the Gulf of Mexico. Some exciting findings were that these large predators of gelatinous organisms tended to forage near sea-surface lows, which are generally sites of upwelling and anomalously high biological productivity (determined from a combination of satellite tags on turtles and remote sensing data). Turtles also foraged near the shelf break, especially around the Florida panhandle. These findings have direct applications to conservation of these large charismatic animals that make massive reproductive and foraging migrations. Congratulations to Katrina! Be on the lookout for more publications from her dissertation coming soon.
We just got a new paper published in Fisheries Oceanography, which you can read here (or contact me for a pdf). In short, we were able to provide a quantitative description of the association of lobster phyllosoma with jellies (also known as gelatinous zooplankton). This is a fascinating relationship where the lobster larvae attach to gelatinous zooplankton of varying sizes (sometimes one individual phyllosoma attaches to several jellies simultaneously) and likely use them as both a floating shelter and a food resource. During the fall in the Gulf of Mexico, the phyllosoma were abundant, and ~30% of them were attached to gelatinous zooplankton, with a higher probability of attachment further from shore (toward the south). This kind of species interaction can only be revealed through in situ imaging and likely has some evolutionary benefit for the life history of lobsters.
I finished reading this book about a month ago, but it has taken me some time to organize my thoughts and decide how to summarize such a thorough and entertaining piece of non-fiction. If you don't read any more of this post, my take home message is: go read the book! Even if you have never been to the Gulf of Mexico, there are so many important lessons throughout its history.
It is rare that you get to read a book that covers so many different and often dense subjects in such an effortless and entertaining manner, but that is exactly what Jack E. Davis has accomplished in “The Gulf: The Making of an American Sea.” The book brings to life historical figures that shaped the Gulf and weaves in fascinating information about the ecology and geology that make the Gulf coast such a biologically rich area.
The first chapters summarize the various European expeditions made to map and explore the Gulf coast. In many cases, the first navigators of the Gulf had no idea what they were doing – errors were made mapping the location of the Mississippi River, and Spanish settlers did not know how to live off the productive estuaries as the Native Americans did. This lack of Gulf survival skills came from the settlers’ inability to observe and adapt to a landscape that was very different from their homeland. The chapters often mention an ethnologist named Cushing, and Davis uses a unique writing style to summarize information through the lens of a sleuth uncovering clues about how the native cultures lived.
A character that periodically comes up is the artist Walter Anderson who lived in Ocean Springs, Mississippi, and the author regards Anderson as one of the first Gulf coast naturalists. Anderson loved the barrier islands and would make the 12-mile paddle out to them regularly, often staying for several days. He was particularly fascinated with Horn Island, and even survived a hurricane there. While residing on the island, he would paint the natural beauty and became acutely aware of changes caused by pollution. I can’t wait to take a trip out to Horn Island (the largest Mississippi barrier island) at some point.
In many ways, the history of the Gulf is a series of tragedies. I found myself wanting to go back in time to see the expansive, pristine pine forests that once thrived along the Gulf coast before shipbuilding led to their demise. During the bird feather fashion craze in the late 19th century, large birds, such as herons and egrets, were mercilessly shot along the Gulf coast. Upon the sound of gunfire, the parents would instinctively guard their nests, making them easy targets for slaughter. Mangroves were unwisely destroyed to make room for coastal development – people were not aware of their critical ecological role at the time. But every time the Gulf ecosystem seemed on the verge of destruction, a hero emerged to stand up for the thing that originally drew people to the Gulf coast: its natural beauty.
The most concerning and relevant material (when it comes to environmental management) comes in the final chapters, where we learn about various instances when long-term environmental sustainability was sacrificed for short-term economic gain. This is a common theme along the Gulf coast (Louisiana, in particular) as well as all over the country. I enjoyed the scientific history of the discovery of the Gulf of Mexico “dead zone” (an area of low-oxygen bottom waters) and its mechanisms of formation. It is a bit sad because we have known about the dead zone for decades now (as well as the processes that influence it), yet it continues to expand in size. The summer of 2017 saw the largest dead zone on record. The book overall is an extremely valuable, holistic treatment of environmental history, and I hope it will be read by many, so we can learn from the mistakes of the past and preserve the Gulf for future generations.
It is pretty common knowledge that young people today are reading and writing more than ever, but it is often in an unstructured way – the kind of writing used in text messages or on Twitter. Teachers have taken note of this change in style, and they have documented a decline in writing skill (75% of 12th and 8th graders are not proficient in writing). There is a push toward developing new methods to help kids learn to write well, which seems like a daunting task. In this New York Times article, Dana Goldstein focuses on the stories of students and teachers, and what it takes to develop writing skills.
Speaking from my own experience, I remember in high school being told to free-write or draw on something from my life to inspire the written word. To me, this was not particularly helpful to develop writing skills, and the teachers in Dana Goldstein’s article agree that free-writing has not improved kids’ abilities. Writing for most of my early life was a difficult process, and I really did not come to enjoy it until college. The difference was that the writing became more goal-oriented. There was a purpose or an argument that I was striving to make, and that process – crafting the right way to organize and present evidence to build an argument – became fun, and it drove me to hone my writing skills.
So what makes a good writer? Certainly some people have innate writing ability, but the most important thing anyone needs to understand is that good writing takes hard work. Some of my most valuable writing experiences came from having a professor read a draft of a manuscript and completely rip it to shreds, metaphorically speaking, which required me to take a step back and look at the big-picture goals of the manuscript at hand. Even though I probably thought I had decent writing ability at the time, my writing was not clear or well organized. Once I had some paragraphs written, I felt a sort of attachment to those words, almost like a sunk-cost fallacy – a desire to hold onto something that has taken a substantial amount of time and effort. A significant hurdle involved just generating the will to, in some cases, completely remove paragraphs and start the text fresh. Sometimes this is what it takes to create high-quality prose, and any good writer must also have a thick skin and be open to criticism.
For people interested in writing about science and other non-fiction, I highly recommend Steven Pinker’s book “The Sense of Style: The Thinking Person’s Guide to Writing in the 21st Century.” If you read this or his other books, you will see that he is able to write clearly about complex subjects, which is often a struggle for me and other scientists. I especially enjoyed the parts of the book where he presents a few paragraphs from another author and goes through, in detail, how the author fails or succeeds in making his or her point. We can certainly learn a lot just from seeing how good writers critique others.
Computer vision, a form of Artificial Intelligence (AI) that involves computers extracting information from images, has tons of potential applications to all sorts of business and scientific needs. It is not surprising that various groups are investing in development of these techniques, but, as Gary Marcus points out in a New York Times op-ed, most of these approaches are bottom-up, crunching huge amounts of data on pixel color and pattern (i.e., AI as a “passive vessel”) to discern content or classify the image. At the same time, the approaches are confined to small groups in labs or companies that have little incentive to share their breakthroughs with the outside world. Another technical problem is that these approaches can produce incorrect results for reasons that are hard for a human to identify because they often come from multiple processing steps that are difficult to trace.
“To get computers to think like humans, we need a new A.I. paradigm, one that places “top down” and “bottom up” knowledge on equal footing. Bottom-up knowledge is the kind of raw information we get directly from our senses, like patterns of light falling on our retina. Top-down knowledge comprises cognitive models of the world and how it works.”
Marcus calls for approaches to AI that utilize more top-down approaches – that is, incorporate the strengths of human intelligence into the AI framework. More data (bottom-up) do not necessarily lead to a better decision, especially if that decision involves complex thought, such as considering image context or future actions of objects.
I wholeheartedly agree with Gary Marcus’s position based on my own experiences in the computer vision world. I am not a computer scientist, but I have worked for years on plankton imaging and the automated analysis of the images, so I have a general familiarity with the approaches used to analyze image data. Currently, there is a push within the image processing world towards “deep learning” techniques, a form of AI that appears similar to previous approaches to recognizing plankton images – extract as much data as possible and categorize based on a training set that creates a model for the algorithm to follow. Over time, working with the results of various computer classification techniques, I have developed a profound new respect for the human brain. We truly are image processing wizards – we easily look at a 2D image and interpret it in 3D, and we have incredible skill for understanding the context of an image – two things that are difficult to communicate to a computer, which works in mathematical terms. How do you describe mathematically that an object can have multiple orientations toward the camera, and that all of these orientations should be considered the same type of object? This is not trivial to implement in a computer program, but it is quite easy for our brains.
Our ability to construct “cognitive models of the world” gives rise to numerous mental shortcuts that are both accurate and computationally inexpensive. For example, within the study of predator-prey interactions and Batesian mimicry, there is the idea of “feature saltation”, which essentially means that a predator uses one or two visual traits to assess whether a potential prey item is palatable or threatening. This is exactly what humans do to recognize objects. We assess the overall shape, which computers do quite well, but then we cue in on particular features of the image (e.g., lighting, positioning of eyes, stems, etc.). Once we spot one or two relatively subtle features, we can typically make a positive and accurate identification – and even say something about what may be happening in the image. From a computer’s perspective, it is difficult to cue in on specific features, which is why deep learning algorithms can periodically be “tricked” in non-intuitive ways. Marcus mentioned an example of a deep learning algorithm mistaking a pattern of yellow and black stripes for a school bus.
I hope the AI community takes some of these suggestions to heart, because this is an exciting field that could potentially progress faster if we change a few approaches. Although he doesn’t say this explicitly, I believe Marcus would agree that we need more research on how human (and other animal) brains construct these “cognitive models”, which will help computer scientists more accurately incorporate this top-down knowledge into AI.
About a week ago, the University of Southern Mississippi held an “open ship”, allowing members of the public to come aboard the RV Point Sur, meet some scientists, and learn about the research being done in their backyard at the School of Ocean Sciences and Technology. Lucho Chiaverano and I represented the plankton team and showed some ISIIS images and plankton samples preserved in ethanol. Most of the younger visitors were naturally drawn to the shark jaws and giant whale vertebra that scientists from GCRL brought on board, but I think people walked away with some appreciation for the little guys in the sea. I was very happy with the crowd turnout (pretty good for midday on a Monday), and the local news did a short story about the ship and its visitors. It was great to meet people interested in marine science, and hopefully we can hold another open ship soon!
Tomorrow I will be giving a 30-minute presentation about CONCORDE research at the U.S. Coast Guard Sector Mobile, Mississippi Area Committee Meeting. The talk will take place at the Grand Bay National Estuarine Research Reserve and will cover the main objectives of our research consortium, along with various applications of the findings toward oil spill mitigation. I am in the final stages of completing an overview paper summarizing some results from CONCORDE, so I have tried to adapt the content of that paper into a talk for a more general audience. I am not totally sure how it will go over, but I am excited for the opportunity to relate our work to real-world applications in the community.
I am now reading The Gulf: The Making of an American Sea by Jack E. Davis. Dr. Davis, an environmental history professor at the University of Florida, provides an overview of the stories and people that have had the greatest impact on the Gulf of Mexico and, conversely, of how the Gulf has affected native cultures and settlers alike. I will post a full review once I finish the book.
Congratulations to Dr. Brian Dzwonkowski of the University of South Alabama Dauphin Island Sea Lab for publishing a new paper in Continental Shelf Research! The paper describes the biogeochemical response of shelf waters to a meteorological flushing event that occurred just after the remnants of Hurricane Patricia passed over the northern Gulf of Mexico. At the edge of a freshwater-influenced region, we documented a dense aggregation of Trichodesmium (a nitrogen-fixing cyanobacterium) that had apparently exploited an ecological niche in this relatively small area, as indicated by low N:P ratios and relatively strong salinity stratification. This is one of the first studies to be published as part of the CONCORDE consortium and provides insight into biological-physical coupling in this region. Stay tuned for more publications from our group!
My new website is live. Updates on research and news will happen here.