A stormy night in Donegal has led to difficult driving conditions in various parts of the county. Heavy rainfall has left large amounts of surface water on the roads.

AA Roadwatch is reminding drivers that it takes longer to stop on wet roads, so slow down and keep further back from the vehicle in front. Local reports from East Donegal warn that spot flooding between Rossgier and Porthall is causing dangers.

The Status Yellow wind and rain warnings have been lifted in the north west, but drivers should be on alert for wind-blown debris.

"Be particularly mindful of vulnerable road users such as cyclists, motorcyclists and pedestrians," said AA Roadwatch advisors.

Motorists warned to take care due to wet roads was last modified: December 18th, 2018 by Rachel McLaughlin

This week's warm and sunny weather forecast in most parts of the state likely has many digging out their garden gloves and playing in the dirt. But before you head to a big-box store to purchase your seeds and plants, the state's land conservancies and family farms hope you'll consider heirloom varieties.

Nancy Long and her husband, Harold, of Long Family Farms and Gallery, exclusively plant vegetables on their farm in Cherokee County from seeds passed down from the Eastern Band of Cherokee.

"There's the ability to share with others by sharing the seeds and the stories and the memories," she says. "Like, it might be, 'Oh, these were grandpa's butter beans or grandma's favorite tomato.' All these seeds have so many different stories and the memories that go along with it."

Harold Long, a member of the Cherokee, recently traveled to Oklahoma to retrieve Cherokee tan pumpkin seeds, once thought lost but found on a farm there. They've since shared the seeds through an outreach program. The Mainspring Conservation Trust has conservation easements on four farms in Cherokee and Clay counties to help preserve farms such as the Longs'.

Land conservancies are able to preserve farmland through the USDA's Natural Resources Conservation Service Farm Bill, currently up for renewal in Congress.

Sara Posey is the Hiwassee Programs Manager for the Mainspring Conservation Trust and says that beyond the importance of preserving horticultural history, maintaining a variety of strains helps protect the food supply.

"When we're in a monoculture, we are susceptible to insect blights, and if we only have one strain of a plant, then it's gone, and that's the resurgence with heirloom," she explains. "They are all genetically different."

Long says people who have heirloom plants and vegetables on their own land can help secure the long-term history of varieties for generations to come.

"If more people would take an interest and have seed exchanges, more of the seeds would be able to get into other people's hands, so really in order to save a seed, you have to share them," Long adds.

Recently, the University of North Carolina-Asheville hosted a seed exchange. The Carolina Farm Stewardship Association has a similar program.

As 2016 is the year of a presidential election, it was quite fitting that SHRM's Annual Conference and Exposition was held in Washington, D.C. While storm clouds may continue to gather and brew on our political horizon, the weather and atmosphere for SHRM was quite the opposite.

We are lucky that the HR and benefits industry has SHRM, because it doesn't just attract industry professionals seeking to improve themselves, but also to improve others and the industry we are in.

What better example of that than a presentation by Sal Khan of the Khan Academy. Sal spoke about the humble beginnings of his career as an educator, starting off as a means to tutor his cousin online in mathematics and growing into what's now known as the Khan Academy. The academy's video channel has millions of subscribers, and his work led Time to list him among the 100 most influential people in the world in 2012.

This got me thinking about our roles as HR leaders and employee benefit advisors and what makes our industry so special. Much like Sal, we too are driven by a desire to improve the human elements of organizations across the country. At HUB International, for example, many of our employees specialize in communicating with and educating employees to better understand their benefits options as the ACA continues to turn much of the benefits world, as we knew it, upside down. Flunking your math test will lead to a bad grade average. But aren't the potential consequences for an employee flunking their benefits enrollment by selecting a bad plan choice far, far greater? The importance of SHRM's work, and our work as industry professionals and educators, goes on.

Another terrific (and very topical) highlight of the event was a heated 'crossfire session' between CNN commentator Paul Begala and Fox commentator and Daily Caller editor Tucker Carlson on the coming presidential election. Perhaps inevitably, their opinions on the answers to some of the looming health care compliance issues we face differed greatly. Whatever your opinion of our presidential candidates, and of the likely winner, I'm sure you'll agree that the need for cool heads and for our services as educators will remain whoever wins.

I have the great honor of being a co-chairman of SHRM's next conference in 2017, an event that will be held here in my hometown of New Orleans, Louisiana. As you'd expect, the Louisiana Chapter of SHRM is eager to put on a great show of charm and hospitality for everyone who can make it down here. Whatever storm clouds pass over D.C. this November, we can all look forward to a tactical refresh and regrouping of our craft in The Big Easy.

Jim Casadaban, MBA
Senior Vice President, HUB International Gulf South

The Next Generation Security Leader (NGSL) program was developed by the Security Executive Council (SEC) in collaboration with industry thought leaders and security practitioners to create a curriculum intended to guide the next generation of strategic thinking.

Approved and tested on six continents with the industry-acclaimed University of South Carolina's Darla Moore School of International Business, this program delivers:

• A strategic business approach to all-hazards risk mitigation
• Access to the latest research and proven practices presented by our Solution Innovation Partners
• A collaborative influence platform that benefits security leaders and their functional colleagues such as Human Resources, Operations, Marketing, and Finance
• A persuasive value proposition for your organization, including the modeling and creation of a program case study for senior executives
• Access to a highly leveraged network of current and former risk and security executives

Many security practitioners have relied on the 3 C's, compliance, crisis response and complacency, in lieu of continuous improvement. Francis D'Addario, emeritus faculty for Strategic Influence and Innovation for the Security Executive Council and former vice president of Partner and Asset Protection for Starbucks Coffee, states: "In this atmosphere of global risk, our sometimes poor decisions have resulted in a short supply of long-term security results producers and successors. Our next-generation leadership may be our last, best chance to compensate for dwindling resources."

"Our research shows that much of a security leader's success revolves around communication and receptiveness," says Kathleen Kotwica, EVP and Chief Knowledge Strategist for the Security Executive Council.

Approved for up to 9 CPE credits with ASIS International, attendees will come away from this conversation with actionable ideas they can interject into their programs. This is an event for security practitioners, created by security practitioners.

To learn more about NGSL and to see the agenda for the next event, please visit: https://www.securityexecutivecouncil.com/spotlight/?sid=29527.

In Pt1 of this blog post I looked at a SQL query and data set to run in Hadoop, and in Pt2 wrote the Map function to extract the relevant fields from the data set to satisfy the query. At this point, however, we still have not implemented any of the aggregate functions, and we still have a large key and value intermediate data set. The only data eliminated so far has been the lines where the date was not less than or equal to 11-AUG-98. On the test data set, out of the initial 600037902 lines of data we now have 586996074 lines remaining; to complete the query we now need to write the reduce phase.

The Reduce method will extend the Reducer class. This needs to accept the intermediate key-value pairs output by the mapper and therefore will receive as input the key, which is fields 9 and 10 concatenated, and the DoubleArrayWritable containing the values. For every key we need to iterate through the values and calculate the totals required for the SUM(), AVG() and COUNT() functions. Once these have been calculated, we can format the output as text to be written to a file that will give us exactly the same result as if the query had been processed by a relational database.
This reduce phase will look something like the following, simply adding all of the values in the array for the SUM() functions and then dividing by the COUNT() value to calculate the result of the AVG() functions.

    for (DoubleArrayWritable val : values) {
        x = (DoubleWritable[]) val.toArray();
        sum_qty += x[0].get();
        sum_base_price += x[1].get();
        sum_discount += x[2].get();
        count_star += x[3].get();
        sum_disc_price += x[4].get();
        sum_charge += x[5].get();
    }
    avg_qty = sum_qty / count_star;
    avg_price = sum_base_price / count_star;
    avg_disc = sum_discount / count_star;
    /* Format and collect the output */
    Text tpchq1redval = new Text(" " + sum_qty + " " + sum_base_price + " " + sum_disc_price
            + " " + sum_charge + " " + avg_qty + " " + avg_price + " " + avg_disc + " " + count_star);
    context.write(key, tpchq1redval);

Coupled with the Map phase and a Job Control section (this will be covered in the next post on running the job), this Job is now ready to run. However, as we have noted previously, just for our 100GB data set the map phase will output over 586 million lines of data, which will involve a lot of network traffic and disk writes. We can make this more efficient by writing a Combiner.

The Combiner also extends the Reducer and in simple cases, but not all (as we will cover in a moment), can be exactly the same as the Reducer. The aim of the combiner is to perform a Reducer-type operation on the subset of data produced by each Mapper, which will then minimise the amount of data that needs to be transferred throughout the cluster from Map to Reduce. The single most important thing about the Combiner is that there is no certainty that it will run. It is available as an optimization, but for a particular Map output it might not run at all, and there is no way to force it to run. From a development perspective this has an important consequence: you should be able to comment out the line in the Job Control section that calls the Combiner, and the result produced by the MapReduce Job stays exactly the same.
Additionally, the input fields for the Combiner must be exactly the same as those expected by the Reducer to operate on the Map output, and the Combiner output must also correspond to the input expected by the Reducer. If your Combiner does not adhere to these restrictions, your job may compile and run and you will not receive an error; however, if not implemented correctly, your results may change on each run from additional factors such as a changing input block size. Finally, the Combiner operation must be both commutative and associative. In other words, neither changing the order of the operands nor changing the grouping of the operations you perform may change the result. In our example the SUM() function is both commutative and associative: the numbers can be summed in any order, and we can perform the sum operation on different groups, and the result will always remain the same. AVG(), on the other hand, is commutative but not associative. We can calculate the average with the input data in any order; however, we cannot take an average of smaller groups of values, then take the average of this intermediate data, and expect the result to be the same.
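The distinction between SUM() and AVG() above can be checked in a few lines of plain Java. This is a hypothetical stand-alone demonstration, not part of the Hadoop job; the class and method names are invented for illustration. Summing per-group partial sums always matches the global sum, while averaging per-group averages generally does not match the global average, which is exactly why the Combiner below may sum but must not average.

```java
// Stand-alone demonstration (illustrative only, not the job's classes):
// SUM is associative -- per-group partial sums combine to the global sum --
// while AVG is not: the average of group averages differs from the true average.
public class AssociativityDemo {

    // Global sum over all values.
    static double sumAll(double[] values) {
        double sum = 0.0;
        for (double v : values) sum += v;
        return sum;
    }

    // Sum each group separately, then sum the partial results
    // (the shape of work a Combiner performs).
    static double sumOfGroupSums(double[][] groups) {
        double total = 0.0;
        for (double[] g : groups) total += sumAll(g);
        return total;
    }

    // Global average over all values.
    static double avgAll(double[] values) {
        return sumAll(values) / values.length;
    }

    // Average each group, then average the averages -- the *wrong* way:
    // this weights every group equally regardless of its size.
    static double avgOfGroupAvgs(double[][] groups) {
        double total = 0.0;
        for (double[] g : groups) total += avgAll(g);
        return total / groups.length;
    }
}
```

For the values {1, 2, 3, 10} split into groups {1, 2, 3} and {10}, both sum paths give 16, but the average of group averages is (2 + 10) / 2 = 6 rather than the true average of 4.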
For this reason the Combiner can perform the SUM() operation but not the AVG(), and can look as follows, producing the intermediate sum values only for the Reducer.

    for (DoubleArrayWritable val : values) {
        x = (DoubleWritable[]) val.toArray();
        sum_qty += x[0].get();
        sum_base_price += x[1].get();
        sum_discount += x[2].get();
        count_star += x[3].get();
        sum_disc_price += x[4].get();
        sum_charge += x[5].get();
    }
    outArray[0] = new DoubleWritable(sum_qty);
    outArray[1] = new DoubleWritable(sum_base_price);
    outArray[2] = new DoubleWritable(sum_discount);
    outArray[3] = new DoubleWritable(count_star);
    outArray[4] = new DoubleWritable(sum_disc_price);
    outArray[5] = new DoubleWritable(sum_charge);
    DoubleArrayWritable da = new DoubleArrayWritable();
    da.set(outArray);
    context.write(key, da);

At this stage we have written the Mapper, Reducer and Combiner, and in Pt4 we will look at adding the Job Control section to produce the completed MapReduce job. We will then consider compiling and running the job and tuning for performance.
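The correctness requirement for this sum-only Combiner can also be modelled without Hadoop. The sketch below is a hypothetical plain-Java model under my own assumptions (the record width and class names are invented; real records in the job carry six DoubleWritable fields): reducing directly over all map outputs must give the same totals as pre-summing each map's partition with the combiner and then reducing the partial sums.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical model of the sum-only Combiner (illustrative names, not the
// job's actual classes). Each record is an array of partial values; the
// combiner pre-sums one partition element-wise, and the reducer sums
// whatever it receives -- raw records or combined partials.
public class CombinerModel {

    // Element-wise sum of a batch of records: the shared core of the
    // SUM() work done by both the Combiner and the Reducer.
    static double[] sumRecords(List<double[]> records, int width) {
        double[] out = new double[width];
        for (double[] r : records)
            for (int i = 0; i < width; i++) out[i] += r[i];
        return out;
    }

    // Reduce directly over every map output (no combiner).
    static double[] reduceDirect(List<double[]> mapOutputs, int width) {
        return sumRecords(mapOutputs, width);
    }

    // Run the combiner over each partition first, then reduce the partials.
    // Because SUM is associative, this must equal reduceDirect.
    static double[] reduceWithCombiner(List<List<double[]>> partitions, int width) {
        List<double[]> partials = new ArrayList<>();
        for (List<double[]> p : partitions) partials.add(sumRecords(p, width));
        return sumRecords(partials, width);
    }
}
```

Because the Combiner is an optional optimization, this equality is the property to verify: partitioning the map output differently, or skipping the combiner entirely, must leave the reducer's totals unchanged.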

In 1945, a year after the Allies stormed the beaches of Normandy, the ocean was busy establishing a beachhead of its own, burrowing beneath a fortified section of an important Antarctic glacier. Pine Island Glacier, a Texas-sized, 2-kilometer-thick ice sheet, is the linchpin of the rapidly disappearing West Antarctic Ice Sheet, one of the largest drivers of uncertainty for sea level rise this century. No glacier has lost more water to the ocean in recent years: it is thinning by more than a meter each year as warm ocean water creeps in and melts it from underneath, hollowing out a vast ice shelf. Now, scientists can trace the beginning of this accelerated melting to a surge of warming in the Pacific Ocean more than 70 years ago.

Researchers already knew that in the 1970s the glacier lost contact with an undersea ridge that had held ocean water at bay. But how long did it take for the ocean to worm its way through? Working in remote conditions, researchers in the winter of 2012 ran a drill through 450 meters of ice and 500 meters of ocean to collect seafloor sediments on either side of this lost bulwark. Analyzing and dating these rocks, they found that ocean water began to appear on the ridge's land-facing side in 1945, even as the ice sheet remained grounded on the ridge's summit, scientists report online today in Nature. Furthermore, they found that the incursion of ocean water followed a notably warm El Niño in the Pacific Ocean between 1939 and 1942. It would be nearly a half-century before the oceans around Antarctica saw such warmth again. Yet the water in the cavity never refroze, suggesting that the melting of some ice sheets will be difficult to reverse, even if human-driven warming is curbed.