Masashi Ozaki, known as Jumbo, is the most successful golfer in the history of the Japan Golf Tour, with about 40 more wins than the next most successful player. He was nicknamed Jumbo for his power off the tee and his size (5’11”, 198 lbs).
Japanese golf icon Masashi “Jumbo” Ozaki, widely regarded as the greatest player in the nation’s history, passed away on Tuesday after a battle with sigmoid colon cancer. He was 78.
The Japan Golf Tour Organization (JGTO) confirmed the news on Wednesday, noting that Ozaki had been diagnosed with the disease approximately one year ago. A family funeral has been held privately, with plans for a public farewell event to be announced in the future.
Born on January 24, 1947, in Tokushima Prefecture, Ozaki initially pursued a career in professional baseball, pitching and playing outfield for the Nishitetsu Lions (later Seibu Lions) from 1965 to 1967. At age 23, he transitioned to golf, turning pro in 1970 and quickly establishing himself as a dominant force.
Nicknamed “Jumbo” for his imposing 181 cm, 90 kg frame and booming drives—evoking the Boeing 747 jumbo jet that debuted around the same time—Ozaki amassed an unparalleled record. He secured 94 victories on the Japan Golf Tour, the most in its history, along with additional wins for a career total exceeding 110 tournaments (sources vary slightly between 112 and 114). His triumphs included five Japan Open titles and six Japan PGA Championships.
Ozaki’s charisma shone through dramatic comebacks, including four victories where he erased eight-shot deficits. “What made him charismatic was the fact that he won four times in which he came back from eight shots behind,” the JGTO has noted on its website. “He pulled off some incredible shots a number of times.”
Internationally, Ozaki made his mark early, becoming the first Japanese golfer to finish in the top 10 at the Masters Tournament with an eighth-place result in 1973. He competed in 19 Masters, 13 U.S. Opens, and represented the International Team at the 1996 Presidents Cup. His best major finish outside Japan was a tie for sixth at the 1989 U.S. Open, and he reached a career-high world ranking of No. 5.
He claimed the Japan Golf Tour money title a record 12 times, including a streak of five consecutive seasons from 1994. At 55, he became the tour’s oldest winner by triumphing at the 2002 ANA Open.
In 2011, Ozaki was inducted into the World Golf Hall of Fame, joining Isao Aoki as the only Japanese men to receive the honor. “Ozaki is often thought to be to Japanese golf what Arnold Palmer is to American golf,” the Hall of Fame website states. “His success has spawned an entire generation of Japanese golf professionals, both male and female.”
Upon his induction, Ozaki reflected: “I am very happy, very honoured and appreciate everyone who has supported me since I turned pro in 1970. My only regret is not playing more outside of Japan, but I dedicated my life to Japanese golf and am extremely grateful the voters thought I was worthy of this honour.”
Ozaki came from a golfing family; his younger brothers Tateo (“Jet”) and Naomichi (“Joe”) also enjoyed successful professional careers, ranking among the tour’s all-time money leaders.
The golf world has mourned the loss of a pioneer whose power, personality, and perseverance elevated the sport’s popularity in Japan and inspired countless players worldwide.
Dick Cheney, America’s most powerful modern vice president and chief architect of the “war on terror,” who helped lead the country into the ill-fated Iraq war on faulty assumptions, has died, according to a statement from his family. He was 84.
“His beloved wife of 61 years, Lynne, his daughters, Liz and Mary, and other family members were with him as he passed,” the family said, adding that he died due to complications of pneumonia and cardiac and vascular disease.
“Dick Cheney was a great and good man who taught his children and grandchildren to love our country, and to live lives of courage, honor, love, kindness, and fly fishing,” the family added.
“We are grateful beyond measure for all Dick Cheney did for our country. And we are blessed beyond measure to have loved and been loved by this noble giant of a man.”
The 46th vice president, who served alongside Republican President George W. Bush for two terms between 2001 and 2009, was for decades a towering and polarizing Washington power player.
Bush described Cheney in a statement Tuesday as a “decent, honorable man.” “History will remember him as among the finest public servants of his generation – a patriot who brought integrity, high intelligence, and seriousness of purpose to every position,” Bush said.
In his final years, Cheney, still a hardline conservative, nevertheless became largely ostracized from his party over his intense criticism of President Donald Trump, whom he branded a “coward” and the greatest-ever threat to the republic.
In an ironic coda to a storied political career, he cast his final vote in a presidential election in 2024 for a liberal Democrat, and fellow member of the vice president’s club, Kamala Harris, in a reflection of how the populist GOP had turned against his traditional conservatism.
Cheney was plagued by cardiovascular disease for most of his adult life, surviving a series of heart attacks to lead a full, vigorous life, and he lived for many years in retirement after a heart transplant in 2012 that he hailed in a 2014 interview as “the gift of life itself.”
Cheney, a sardonic former Wyoming representative, White House chief of staff and defense secretary, was enjoying a lucrative career in the corporate world when he was charged by George W. Bush with vetting potential vice-presidential nominees. The quest ended with Cheney himself taking the oath of office as a worldly number two to a callow new president who arrived in the Oval Office after a disputed election.
While caricatures of Cheney as the real president do not accurately capture the true dynamics of Bush’s inner circle, he relished the enormous influence that he wielded from behind the scenes.
Cheney was in the White House, with the president out of town on the crisp, clear morning of September 11, 2001. In the split second of horror when a second hijacked plane hit the World Trade Center in New York, he said he became a changed man, determined to avenge the al Qaeda-orchestrated attacks and to enforce US power throughout the Middle East with a neo-conservative doctrine of regime change and pre-emptive war.
“At that moment, you knew this was a deliberate act. This was a terrorist act,” he recalled of that day in an interview with CNN’s John King in 2002.
Cheney reflected in later years on how the attacks left him with an overwhelming sense of responsibility to ensure such an assault on the homeland never happened again. Perceptions, however, that he was the sole driving force behind the war on terror and US ventures into Iraq and Afghanistan are misleading.
Contemporary and historical accounts of the administration show that Bush was, as he styled himself, “The Decider.”
From a bunker deep below the White House, Cheney went into crisis mode, directing the response of a grief-stricken nation suddenly at war. He gave the extraordinary order to authorize the shooting down of any more hijacked airliners in the event they were headed to the White House or the US Capitol building. For many, his frequent departures to “undisclosed” locations outside Washington to preserve the presidential chain of succession reinforced his image as an omnipotent figure waging covert war from the shadows. His hawkishness and alarmist view of a nation facing grave threats were not outliers at the time – especially during a traumatic period that included anthrax mailings and sniper shootings around Washington, DC, that exacerbated a sense of public fear even though they were unrelated to 9/11.
The September 11 attacks unleashed the US war in Afghanistan to overthrow the Taliban, which was harboring al Qaeda, though the terror group’s leader Osama bin Laden escaped. Soon, Cheney was agitating for widening the US assault to Iraq and its leader, Saddam Hussein, whose forces he had helped to eject from Kuwait in the first Gulf War as President George H.W. Bush’s Pentagon chief.
The vice president’s aggressive warnings about Iraq’s supposed weapons of mass destruction programs, alleged links to al Qaeda and intent to furnish terrorists with deadly weapons to attack the United States played a huge role in laying the groundwork for the US invasion of Iraq in 2003. Congressional reports and other post-war inquiries later showed that Cheney and other administration officials exaggerated, misrepresented or did not properly portray faulty intelligence about weapons of mass destruction programs that Iraq turned out not to possess. One of Cheney’s most infamous claims – that the chief 9/11 hijacker, Mohamed Atta, met Iraqi intelligence officials in Prague – was never substantiated, including by the independent commission into the September 11 attacks.
But Cheney insisted in 2005 that he and other top officials were acting on “the best available intelligence” at the time.
While admitting that the flaws in the intelligence were plain in hindsight, he insisted that any claim that the data was “distorted, hyped, or fabricated” was “utterly false.”
The conflicts in Iraq and Afghanistan also led the US down a dark legal and moral path, including “enhanced interrogations” of terror suspects that critics blasted as torture. But Cheney – who was at the center of every facet of the global war on terrorism – insisted methods like waterboarding were perfectly acceptable. Cheney was also an outspoken advocate for holding terror suspects without trial at Guantanamo Bay, Cuba – a practice that critics at home and abroad branded an affront to core American values.
Cheney became a symbol of the excesses of the anti-terror campaigns and the fatally false premises and poor planning that turned the initially successful invasion of Iraq into a bloody quagmire. He left office reviled by Democrats and with an approval rating of 31%, according to the Pew Research Center.
To the end of his life, Cheney expressed no regrets, certain he had merely done what was necessary to respond to an unprecedented attack on the US mainland that killed nearly 3,000 people and led to nearly two decades of foreign wars that divided the nation and transformed its politics.
“I would do it again in a minute,” Cheney said when confronted in 2014 with a Senate Intelligence Committee report that concluded the enhanced interrogation methods were brutal and ineffective and had damaged US standing in the eyes of the world.
Of the Iraq war, he told CNN in 2015: “It was the right thing to do then. I believed it then and I believe it now.”
Cheney’s aggressive anti-terror policies fit into a personal doctrine that justified extraordinary presidential powers with limited congressional oversight. That was in line with his belief that the authority of the executive branch had been mistakenly eroded in the aftermath of the Vietnam War and the Watergate scandal that led to the resignation of his first presidential boss, President Richard Nixon.
Yet in his final years, Cheney emerged as a fierce critic of a man who had an even more expansive view of the powers of the presidency than he did – Trump. Cheney had supported Trump in 2016 despite Trump’s criticism of Bush-Cheney foreign policies and his transformation of the party of Reagan into a populist, nationalist GOP. But the end of Trump’s first term, when his refusal to accept his 2020 election defeat led to the January 6 insurrection, prompted Cheney to speak out in a rare, public manner.
The former vice president’s daughter, then-Wyoming Rep. Liz Cheney, meanwhile, sacrificed a promising career in the GOP to oppose Trump after his attempt to overturn the 2020 presidential election and the US Capitol insurrection on January 6, 2021. In an ad for his daughter’s unsuccessful campaign to fight off a pro-Trump candidate’s primary challenge in 2022, Dick Cheney – who was, by then, rarely seen in public – looked directly into the camera from under a wide-brimmed cowboy hat and delivered an extraordinarily direct message.
“In our nation’s 246-year history, there has never been an individual who is a greater threat to our republic than Donald Trump,” Cheney said.
“He is a coward. A real man wouldn’t lie to his supporters. He lost his election, and he lost big. I know it. He knows it, and deep down, I think most Republicans know.”
Richard Bruce Cheney was born January 30, 1941, in Lincoln, Nebraska. While living in the small mountain town of Casper, Wyoming, he met his high school sweetheart and future wife Lynne Vincent. Cheney was accepted to Yale University on a scholarship, but he struggled to fit in and maintain his grades. By his own admission, he was kicked out.
He returned West to work on power lines and was twice arrested for driving under the influence. In a turning point for Cheney, he was given an ultimatum from Lynne, who had “made it clear she wasn’t interested in marrying a lineman for the county,” he told The New Yorker. “I buckled down and applied myself. Decided it was time to make something of myself,” he told the magazine.
Cheney went back to school and earned bachelor’s and master’s degrees in political science from the University of Wyoming. The couple married in 1964.
Cheney is survived by Lynne, his daughters Liz and Mary Cheney and seven grandchildren.
A veteran Washington power broker
Cheney began honing his inside power game – at which he became a master – as an aide to Nixon.
He was later picked by Donald Rumsfeld as his deputy White House chief of staff under President Gerald Ford and then succeeded his mentor and close friend in the job in 1975 when Rumsfeld departed to become defense secretary. Cheney was instrumental in reviving their partnership in 2001 when he recalled Rumsfeld from the political wilderness to return to the Pentagon. The pair formed an extraordinary backroom alliance in the Bush administration throughout the war on terror and the Iraq war – much to the frustration of more moderate members of the administration including then-Secretary of State Colin Powell and National Security Adviser Condoleezza Rice – who took over from Powell in the second term.
While Democratic President Jimmy Carter was in the White House, Cheney decided to run for Congress and was elected to Wyoming’s sole US House seat in 1978. Cheney served six terms, rising to become House minority whip, and racked up a very conservative voting record.
In 1989, President George H. W. Bush, who had served with Cheney in the Ford administration, tapped him to serve as his defense secretary, calling him a “trusted friend, adviser.” He was confirmed by the Senate in a 92-0 vote.
As Pentagon chief, Cheney showed considerable skill in directing the US invasion of Panama in 1989 and Operation Desert Storm in 1991 to push Iraq’s troops out of Kuwait. Following his stint as defense secretary, Cheney briefly explored a run for president in the 1996 election cycle but decided against it.
During Democrat Bill Clinton’s presidency, Cheney joined Dallas-based Halliburton Co., serving as its chief executive officer.
It wasn’t until the younger Bush decided to run for president that Cheney was chosen to lead the Republican candidate’s search for a running mate and, after initially turning down the job himself, ended up on the GOP ticket.
“During the process, I came to the conclusion that the selector was the best person to be selected,” Bush said in the 2020 CNN film “President in Waiting.”
Cheney brought a wealth of knowledge and experience to areas where critics complained Bush was weak. Bush, whose elected experience was limited to the Texas governorship, had never held office in Washington and had little military or foreign policy background compared with Cheney.
Early in Bush’s presidency, Cheney led a task force to develop the administration’s energy policy and sought to keep its records secret in a fight that lasted Bush’s first term and went all the way to the US Supreme Court.
He was, however, at odds with Bush over the issue of same-sex marriage, saying that it should be left to the states to decide. In a 2004 town hall, he acknowledged his daughter Mary’s sexual orientation, reportedly for the first time in public, according to The Washington Post. “With respect to the question of relationships, my general view is that freedom means freedom for everyone. People … ought to be free to enter into any kind of relationship they want to,” he said, the Post reported.
His relationship with Bush was complicated in later years, including by Bush’s refusal to pardon Cheney’s chief of staff Scooter Libby, who had been convicted of perjury and obstruction of justice in 2007 after a probe into who leaked the identity of a CIA operative. Libby was eventually pardoned by Trump in 2018.
In one of the most notorious moments of his personal life, one that added to his grizzled legend, Cheney in 2006 accidentally shot a hunting partner in the face with birdshot, causing relatively minor wounds.
Cheney’s health issues began in 1978, when he had his first heart attack at age 37 while running for Congress. Three more followed in 1984, 1988 and November 2000, just a few days into the Florida presidential ballot recount that resulted in a Bush-Cheney win.
Cheney said at the time that he’d be “the first to step down” if he learned he’d be unable to do the job, and he kept a resignation letter ready in case he was deemed incapacitated. Cheney completed both terms under Bush, attending Barack Obama’s inauguration in January 2009 in a wheelchair.
After a fifth heart attack in 2010, Cheney received a heart pump that kept the organ running until his transplant in 2012.
After leaving office, Cheney returned to private life, penning two memoirs, one about his personal and political career and the other about his struggles with heart disease, as well as a book with his daughter, Liz. He became one of the most strident GOP critics of President Barack Obama, who had based his election campaign on promises to end the wars and reverse what he called the failed policies of the Bush-Cheney administration.
Years later, Cheney was decrying his own party — especially its leadership’s response to the attack on the Capitol — when he returned to the US Capitol with then-Rep. Liz Cheney on the one-year anniversary of the January 6, 2021, attack.
“I am deeply disappointed at the failure of many members of my party to recognize the grave nature of the January 6 attacks and the ongoing threat to our nation,” he said in a statement.
In a remarkable moment, Democrats lined up to greet the former Republican vice president and shake his hand. Former Democratic House Speaker Nancy Pelosi hugged Cheney. The former vice president slammed Republican leaders in Congress, saying they do not resemble the leaders he remembered from his time in the body.
It was a scene that would have been unthinkable two decades earlier and an illustration of how the extraordinary changes in American politics wrought by Trump had made former bitter political foes find common cause in the fight for democracy.
“It’s not leadership that resembles any of the folks I knew when I was here for 10 years,” Cheney said at the Capitol in 2022.
Cheney continued his criticism of Trump in the following years and went as far as to endorse then-Vice President Kamala Harris, a Democrat and Trump’s opponent in the 2024 presidential campaign. He said he would vote for Harris because of the “duty to put country above partisanship to defend our Constitution.” Cheney emphasized his disdain for Trump at the time and warned that he “can never be trusted with power again,” though Trump would go on to win the presidency a couple of months later.
“We were all blessed to know Mary Rose Oakar — a highly gifted, indefatigable, extraordinary woman of deep faith. Mary Rose was elected to Congress from inside the working class of people. She exhibited raw courage, loyalty, perseverance, high learning, precious humor, and stellar insight into human nature. Her hearty laugh elevated people’s spirits. She suffered no fools,” the statement said. “She not only stood her ground but made her own ground — to serve senior citizens, housing, pay equity, and better health care for women, moving into the ranks of Democratic House leaders where she firmly stood as Vice Chair of the Democratic Caucus.”
According to the statement, Oakar was the first Arab American woman, Syrian-American and Lebanese-American to serve in Congress.
“She dedicated endless hours and years to build new bridges toward peace in the Middle East, and understanding of its complexity for communities here at home,” Kaptur said.
“Mary Rose worked hard to promote an economy that serves everyone, across Northern Ohio, and throughout our nation. Her abilities sparkled as she brought joy, wit, keen insight, kindness, and dynamism to every occasion. I am grateful for her abiding friendship and counsel, which she generously shared. She was one of a kind,” the statement concluded. “Holding all of her family, friends, and her community in Cleveland in prayer. She truly loved them with all her heart and soul.”
In a blow to the entertainment industry that underscores the fragility of Hollywood’s golden era, Robert Redford, the charismatic actor, director, and entrepreneurial force behind the Sundance Film Festival, passed away on September 16, 2025, at his cherished home in the Utah mountains. He was 89.
Redford’s death marks the end of an era for a man whose on-screen magnetism and off-screen business savvy transformed the film landscape, generating billions in box office revenue and fostering an indie film economy that challenged the liberal-dominated studio system.
Cindi Berger, CEO of the publicity firm Rogers & Cowan PMK, confirmed the news in a statement: “Robert Redford passed away on September 16, 2025, at his home at Sundance in the mountains of Utah — the place he loved, surrounded by those he loved. He will be missed greatly. The family requests privacy.”
The announcement comes at a time when Hollywood is grappling with declining ticket sales and cultural shifts, and it is a reminder of Redford’s rare independence in an industry often criticized for its left-wing echo chamber.
Born Charles Robert Redford on August 18, 1936, in Santa Monica, California, Redford’s journey from a rebellious youth to a Hollywood powerhouse exemplifies the American Dream of self-made success. After being expelled from the University of Colorado for poor grades and a penchant for mischief, he honed his craft at the American Academy of Dramatic Arts in New York.
His early career blended television appearances on shows like “Perry Mason” and “The Twilight Zone” with Broadway triumphs, including the 1963 hit “Barefoot in the Park” by Neil Simon, which he later adapted to film opposite Jane Fonda.
Redford’s breakthrough came in 1969 with “Butch Cassidy and the Sundance Kid,” co-starring Paul Newman. The Western, which grossed over $100 million (equivalent to nearly $800 million today), became the highest-earning film of the year and was preserved in the National Film Registry in 2003. It launched a string of blockbusters that solidified Redford as a box office juggernaut: “The Sting” (1973), which earned him his only Best Actor Oscar nomination and won seven Academy Awards including Best Picture; “The Way We Were” (1973) with Barbra Streisand, a romantic drama that raked in $50 million despite mixed reviews; and “All the President’s Men” (1976), where he portrayed Washington Post reporter Bob Woodward alongside Dustin Hoffman, exposing the Watergate scandal in a film that garnered eight Oscar nominations.
These hits weren’t just artistic triumphs; they were economic engines. During the 1970s, Redford was Hollywood’s top draw, contributing to films that collectively grossed hundreds of millions and boosted studio profits at a time when the industry was recovering from the decline of the studio system. His collaborations with director Sydney Pollack, spanning seven films including “Three Days of the Condor” (1975) and “Out of Africa” (1985), exemplified efficient, high-return filmmaking. “Out of Africa” alone won seven Oscars and grossed over $227 million worldwide.
Yet Redford’s legacy extends beyond acting into savvy entrepreneurship. In 1969, he founded Wildwood Enterprises, producing films like “Downhill Racer” and “The Candidate” (1972), a satirical take on political ambition that presciently critiqued the Faustian bargains of Washington insiders—resonating today amid ongoing debates about political integrity. His directorial debut, “Ordinary People” (1980), won four Oscars, including Best Director and Best Picture, proving that thoughtful, family-centered dramas could compete commercially against flashier fare.
Perhaps Redford’s most enduring business innovation was the Sundance Institute and Film Festival, established in 1981 in Park City, Utah. What began as a modest filmmakers’ lab evolved into a powerhouse that ignited the independent film boom, launching careers like those of Quentin Tarantino (“Reservoir Dogs”), Steven Soderbergh, and Ryan Coogler (“Fruitvale Station”). Sundance has generated an estimated $100 million annually for Utah’s economy through tourism and production, creating jobs and attracting investment in a red-state haven far from Hollywood’s coastal elite. Critics from the right have praised it as a merit-based platform that democratized filmmaking, countering the big-studio monopolies often accused of pushing progressive agendas.
However, Redford’s outspoken liberalism sometimes clashed with his business acumen. A vocal environmental activist and trustee of the Natural Resources Defense Council, he opposed projects like the Keystone XL pipeline and advocated for Arctic Wildlife Refuge protections—stances that conservatives argue stifled energy independence and economic growth.
His films, such as “Lions for Lambs” (2007) critiquing U.S. involvement in Afghanistan, were seen by some as preachy civics lessons that underperformed at the box office. Still, Redford’s ability to leverage celebrity for causes while maintaining commercial viability highlights a pragmatic streak rare in Tinseltown.
In later years, Redford scaled back acting, with notable roles in “Captain America: The Winter Soldier” (2014) and “Avengers: Endgame” (2019) as the villainous Alexander Pierce—ironic given his anti-establishment roots. His final film, “The Old Man & the Gun” (2018), capped a career that spanned more than 50 years. He received an honorary Oscar in 2002, the Presidential Medal of Freedom from Barack Obama in 2016, and international accolades like the Légion d’Honneur.
Redford was married twice: first to Lola Van Wagenen (1958-1985), with whom he had four children (two of whom predeceased him), and then to artist Sibylle Szaggars in 2009. He is survived by Szaggars, two children, and grandchildren.
As Hollywood faces streaming disruptions and cultural reckonings, Redford’s death prompts reflection on a time when stars like him drove genuine box office success through talent and innovation, not just ideology. His Sundance legacy endures as a beacon for free-market creativity in film.
LONDON—Terence Stamp, who made his name as an actor in 1960s London and went on to play the arch-villain General Zod in the Hollywood hits “Superman” and “Superman II,” has died aged 87, his family said on Aug. 17.
The Oscar-nominated actor starred in films ranging from Pier Paolo Pasolini’s “Theorem” in 1968 and “A Season in Hell” in 1971, to “The Adventures of Priscilla, Queen of the Desert” in 1994.
The family said in a statement to Reuters that Stamp died on the morning of Aug. 17.
“He leaves behind an extraordinary body of work, both as an actor and as a writer that will continue to touch and inspire people for years to come,” the family said. “We ask for privacy at this sad time.”
Born in London’s East End in 1938, the son of a tugboat stoker, he endured the bombing of the city during World War II before leaving school to work initially in advertising, eventually winning a scholarship to go to drama school.
Famous for his good looks and impeccable style sense, he formed one of Britain’s most glamorous couples with Julie Christie, with whom he starred in “Far From the Madding Crowd” in 1967. He also dated the model Jean Shrimpton and was chosen as a muse by photographer David Bailey.
After failing to land the role of James Bond to succeed Sean Connery, he appeared in Italian films and worked with Federico Fellini in the late 1960s.
He dropped out of the limelight and studied yoga in India before landing his most high-profile role—as General Zod, the megalomaniacal leader of the Kryptonians, in “Superman” in 1978 and its sequel in 1980.
Tristan Rogers, who played legacy character Robert Scorpio on ABC’s “General Hospital,” died Friday, less than one month after he made a special appearance on the soap opera. He was 79.
“The entire ‘General Hospital’ family is heartbroken to hear of Tristan Rogers’ passing,” said Frank Valentini, the show’s executive producer, in a statement. “Tristan has captivated our fans for 45 years and Port Charles will not be the same without him (or Robert Scorpio).”
Born in Melbourne, Australia, Rogers made his first foray into performing in his early twenties, playing drums in a rock band with a group of friends. The band wasn’t successful, so Rogers turned to commercial work and modeling to earn some money. When the band dissolved, he decided to give acting a try. After various roles in Australia, and a stint working as a DJ, he eventually moved to Los Angeles to try to break into Hollywood. He said casting directors were initially turned off by his accent, but he eventually landed a two-day role on “General Hospital” in 1980.
“I had no idea at the point how big the show was,” Rogers told fellow “General Hospital” actor Maurice Benard on the YouTube show, “State of Mind with Maurice Benard” in 2022.
“I had no name. I was brought in expressly to beat up the hero, Luke, (played by Anthony Geary), and then disappear,” Rogers said. His first day was half-over when then-executive producer Gloria Monty asked if he would like to stay on. They had no character written for him so for three weeks Monty asked him to just appear in scenes “looking furtive, looking suspicious” until they came up with a storyline. It was decided he would play a spy known as “CK8” and eventually he was given the name Robert Scorpio. The character would remain a fixture in Port Charles for the rest of Rogers’ life, even when he wasn’t a current cast member.
Scorpio’s on-again, off-again romance with Emma Samms’ character, Holly Sutton, remained a favorite among fans. Scorpio also had a romance, and many storylines, with another spy, Anna Devane, played by Finola Hughes. Scorpio and Devane shared a daughter, Robin, played by Kimberly McCullough. Samms returned to the show for a stint last fall, when it was revealed that Scorpio was the father of her adult daughter, Sasha Gilmore (played by Sofia Mattson).
Rogers and Samms left the show together in November 2024 in scenes taped with a nod to “Casablanca.” He returned to the show in July for one episode, when Sasha arrived at his home in France with her new baby. It was then revealed that Rogers had lung cancer.
Rogers’ other acting credits include “The Bold and the Beautiful,” “The Young & the Restless” and “Studio City,” which won him outstanding supporting actor in a digital drama series at the Daytime Emmy Awards. He is survived by his wife, Teresa Parkerson, and a daughter and a son.
Hulk Hogan, the towering, charismatic figure who revolutionized professional wrestling in the 1980s and became the first true household name in the sport, passed away on Thursday at the age of 71. His death, confirmed by longtime partner Eric Bischoff and other sources close to the wrestling legend, was reportedly due to a cardiac arrest. Hogan’s passing marks the end of an era for both wrestling and popular culture, where his influence transcended the ring.
Hogan — born Terry Gene Bollea on August 11, 1953, in Augusta, Georgia — changed the landscape of professional wrestling, helping it become a mainstream entertainment spectacle. In a career that spanned over four decades, Hogan became one of the most recognizable celebrities in the world, known for his larger-than-life persona, trademark yellow trunks, bandana, and his signature move, the leg drop.
A Wrestling Legacy Like No Other
Hogan’s journey to wrestling superstardom began in Florida, where, after a youth spent pitching for Little League baseball teams, he was spotted by wrestling promoters while playing in local rock bands. Trained by Hiro Matsuda and inspired by legends like Dusty Rhodes, Hogan worked under several lesser-known ring names early in his career, including Super Destroyer and Sterling Golden, before settling on the iconic Hulk Hogan.
Hogan’s WWE debut in the 1980s heralded the beginning of Hulkamania, a cultural phenomenon that spread far beyond the squared circle. He became the face of the WWE, winning the WWE Championship six times and headlining WrestleMania an unprecedented eight times. His most memorable moment came in 1987, when he faced Andre the Giant in a historic match at WrestleMania III, body-slamming the 520-pound giant before a then-record crowd of 93,173 fans at the Pontiac Silverdome.
Hogan’s connection with the audience was unparalleled. He embodied the spirit of the American hero, often invoking his “Real American” entrance theme, flexing his 24-inch pythons, and posing with an American flag to the thunderous cheers of his fans. Hogan’s catchphrases, like “Whatcha gonna do when Hulkamania runs wild on you?” became as famous as his wrestling bouts.
Hollywood and Beyond: The Wrestler Who Became a Pop Culture Icon
Beyond the ring, Hogan’s acting career took off when he starred as Thunderlips in Rocky III (1982), marking his big-screen debut opposite Sylvester Stallone. His larger-than-life personality translated to Hollywood, where he appeared in films like No Holds Barred (1989), Suburban Commando (1991), Mr. Nanny (1993), and Santa With Muscles (1996). He also starred in the syndicated TV series Thunder in Paradise (1994).
Hogan became a fixture in popular culture, appearing in TV shows such as The A-Team and Baywatch, making a cameo in the film Gremlins 2: The New Batch (1990), and even voicing characters in Robot Chicken and American Dad! He co-hosted Saturday Night Live with Mr. T in 1985, solidifying his place in the mainstream entertainment world.
But it wasn’t just acting that defined Hogan’s legacy. He became a beloved figure, especially for charity work — notably for the Make-a-Wish Foundation, where he was one of the most requested celebrities for children facing life-threatening illnesses.
Hogan’s personal life was as tumultuous as his wrestling career. In 1994, he admitted to using steroids for 13 years, the first of many controversies in his life. Years later, he was embroiled in scandal after a sex tape was leaked; racial slurs he was recorded using surfaced in 2015 and led to his removal from the WWE Hall of Fame. However, Hogan made a dramatic comeback in 2016, when he won a $140 million verdict against Gawker after the website published the tape. The legal victory sent shockwaves through the media world, leading to Gawker’s bankruptcy and eventual sale to Univision.
Hogan was reinstated into the WWE Hall of Fame in 2018, cementing his status as one of the most influential figures in wrestling history.
In recent years, Hogan stayed active in the wrestling world. In April 2025, he and longtime partner Eric Bischoff launched Real American Freestyle, a wrestling league that secured a TV rights deal with Fox Nation. Despite his age, Hogan remained passionate about promoting wrestling to new generations, never straying far from his roots.
Hogan’s Impact on the Wrestling and Entertainment Industry
The impact of Hulk Hogan’s death reverberates across both the wrestling industry and entertainment. His transformation from a regional wrestler to a global sensation helped propel WWE into the mainstream, and his rivalries with wrestlers like Roddy Piper, Andre the Giant, Ric Flair, and “Macho Man” Randy Savage became the stuff of legend. His heel turn in 1996, as the leader of the New World Order (nWo) in WCW, remains one of the most shocking moments in wrestling history.
Hogan’s influence on professional wrestling is immeasurable — he helped shape the modern spectacle of wrestling, where entertainment and athleticism go hand in hand. His “Hulkamania” became a symbol not only of pro wrestling but of the broader entertainment culture that exploded in the 1980s and 1990s.
Hogan is survived by his wife, Sky, whom he married in 2023, and his two children, Nick and Brooke, from his first marriage to Linda Claridge. He was also married to Jennifer McDaniel from 2009 until their separation in 2021.
For the millions of fans who followed his career, Hulk Hogan was more than a wrestler — he was an icon, an inspiration, and a symbol of perseverance. In his own words, “Hulkamania will live forever.” Now, as the world mourns his passing, it is clear that Hogan’s legacy will continue to endure, immortalized in the hearts of fans and the annals of professional wrestling history.
Iconic singer and New Jersey native Connie Francis, known for hits such as “Pretty Little Baby” and “Everybody’s Somebody’s Fool,” has died at 87.
Francis’ death was confirmed on social media by her friend and copyright manager Ron Roberts Thursday — two weeks after she was hospitalized due to “extreme pain.”
“It is with a heavy heart and extreme sadness that I inform you of the passing of my dear friend Connie Francis last night,” Roberts wrote on Facebook. “I know that Connie would approve that her fans are among the first to learn of this sad news.”
The chart-topping vocalist, who earned her stripes as one of the most successful female singers in the 1950s and 1960s, was rushed to the hospital in Florida July 2.
“I am back in hospital where I have been undergoing tests and checks to determine the cause(s) of the extreme pain I have been experiencing,” Francis wrote.
In a series of posts on July 3 and 4, Francis said she was “feeling much better” during her hospital stay.
The following week, the singer — born Concetta Rosa Maria Franconero — told fans she remained under the watchful eye of doctors and nurses as they determined the cause of her pain.
The “Stupid Cupid” songstress said in May that a hip injury had landed her in a wheelchair.
Despite retiring from the music industry in 2018, Francis had recently seen her track “Pretty Little Baby” go viral on TikTok, over six decades after she released the song as part of her 1962 album “Connie Francis Sings.”
“To tell you the truth, I didn’t even remember the song!” Francis said about the track’s resurgence in popularity. “I had to listen to it to remember.”
“To think that a song I recorded 63 years ago is touching the hearts of millions of people is truly awesome. It is an amazing feeling,” the “Jamboree” actress said. “It’s an honor. To see that they’re paying homage to me is just breathtaking.”
“It’s truly awesome. I never thought it was possible. It’s a dream come true. To think that kindergarten kids now know my name and my music? It’s just thrilling,” she added.
In one of her final social media posts, Francis thanked various celebrities — including the Kardashian-Jenner clan, Timothée Chalamet, Ariana Grande and Taylor Swift — who had listened to her viral track on social media.
“There have been many wonderful artists who have paid tribute to me by singing ‘Pretty Little Baby,’ ” the singer said in a TikTok video shared June 26.
Born in Newark, New Jersey, in 1937, Francis fell in love with music at the age of 4, taking part in various talent contests and pageants in her neighborhood.
She later dipped her toes into TV work, landing a prominent spot on NBC’s “Startime Kids” during which she assumed her stage name, Connie Francis.
Her glittering music career boasts a slew of hit tracks, including Top 10 singles “Who’s Sorry Now?,” “My Heart Has A Mind Of Its Own,” “Where the Boys Are” and “Don’t Break The Heart That Loves You.”
She was the first female singer to reach the No. 1 spot on the Billboard Hot 100 chart, with her 1960 song “Everybody’s Somebody’s Fool.”
In 1955, she signed a recording contract with MGM Records, but the partnership proved unsuccessful, as most of Francis’ songs didn’t get traction.
Just as the label was gearing up to drop her in 1957, her father — who had been her biggest fan and supporter — convinced her to record a version of “Who’s Sorry Now?” as a last-ditch attempt to salvage her music career.
Luckily, the singer’s career took great strides in the years that followed, as she was able to rise to stardom through hits like “My Happiness,” “Lipstick on Your Collar” and “Among My Souvenirs.”
What’s more, her 1959 album, “Connie Francis Sings Italian Favorites,” proved a hit with her fans, paving the way for her 1960 track “Everybody’s Somebody’s Fool” to top the newly established Billboard Hot 100.
As the 1970s arrived, Francis’ music career appeared to wane after she suffered several personal setbacks.
In addition to surviving a rape in 1974, Francis temporarily lost her voice in 1977 following nasal surgery. On top of that, her brother George was murdered by the Mafia in 1981.
Still, she attempted to channel her hardships through new songs at the time, though these were unsuccessful.
Her mental health took a hit, prompting her father to commit her to multiple psychiatric hospitals.
After surviving a suicide attempt in 1984, Francis released a tell-all memoir titled, “Who’s Sorry Now?”
Following her personal struggles, the musician partnered with Ronald Reagan’s presidential administration on a task force on violent crime. She was also a voice for rape victims.
Francis further raised awareness of the effects of trauma through her partnership with Mental Health America in 2010.
As for her private life, Francis had dated singer Bobby Darin in the early years of her career — much to her father’s dismay. She considered Darin, who died in 1973 at 37, the love of her life, though her father had kept them apart for reasons unknown.
Shigeo Nagashima, Japan’s most celebrated baseball player and a linchpin of the storied Tokyo Yomiuri Giants dynasty of the 1960s and 1970s, died in a Tokyo hospital on Tuesday. He was 89.
He died of pneumonia, according to a joint statement released by the Giants, the Yomiuri Shimbun newspaper and Nagashima’s management company.
A star from the moment he signed his first professional contract in 1957, Nagashima instantly made a splash with his powerful bat, speed on the basepaths and catlike reflexes as a third baseman. He notched numerous batting titles and Most Valuable Player Awards, and he was a key member of the Giants’ heralded “V-9” teams, which won nine consecutive Japan Series titles from 1965 to 1973.
More than any player of his generation, Nagashima symbolized a country that was feverishly rebuilding after World War II and gaining clout as an economic power. Visiting dignitaries sought his company. His good looks and charisma helped make him an attraction; he was considered Japan’s most eligible bachelor until his wedding in 1965, which was broadcast nationally.
Nagashima signing with the Yomiuri Giants in 1957. (Asahi Shimbun/Getty Images)
The news media tracked Nagashima’s every move. The fact that he played for the Giants, who were owned by the Yomiuri media empire, amplified his exploits. He wore his success and celebrity so comfortably that he became known as “Mr. Giants,” “Mr. Baseball” or, sometimes, simply “Mister.”
“No matter what he did or where he went there was a photo of him — attending a reception for the emperor, or coaching a Little League seminar, or appearing at the premiere of the latest Tom Cruise movie,” Robert Whiting, a longtime chronicler of Japanese baseball, wrote about Nagashima in The Japan Times in 2013. “People joked that he was the real head of state.”
None of that celebrity would have been possible had he not excelled as a ballplayer. Along with his teammate Sadaharu Oh, Japan’s home run king, Nagashima was the centerpiece of the country’s most enduring sports dynasty. He hit 444 home runs, had a lifetime batting average of .305, won six batting titles and five times led the league in runs batted in. He was a five-time most valuable player and was chosen as the league’s top third baseman in each of his 17 seasons. He was inducted into Japan’s Baseball Hall of Fame in 1988.
In his first season, 1958, he led the league in home runs and was second in stolen bases and batting average, earning him rookie of the year honors. And then, early in his second season, he made history in the first game attended by a Japanese emperor, Hirohito, and an empress, Nagako. In the bottom of the ninth inning, Nagashima hit a 2-2 pitch into the left field stands for a game-winning home run, considered one of the most dramatic sports events in Japanese history.
Nagashima hitting a solo home run against the Kokutetsu Swallows in 1959 in Tokyo. (Asahi Shimbun/Getty Images)
One of Nagashima’s trademarks was his work ethic, a character trait that was particularly celebrated during Japan’s postwar rise. Under the guidance of manager Tetsuharu Kawakami, Nagashima practiced from dawn to dusk, enduring an infamous 1,000-fungo drill that required him to field ground ball after ground ball. In the off-season, he trained in the mountains, running and swinging the bat to the point of exhaustion. He bought a house by the Tama River in Tokyo so he could run there, and he added a room to his home where he could practice swinging.
He was often the Giants’ highest-paid player, showered with hefty contracts and bonuses. By the early 1960s, word of his talents had reached the United States. Bill Veeck of the Chicago White Sox tried unsuccessfully to buy Nagashima’s contract, as did Walter O’Malley of the Los Angeles Dodgers, now home to the Japanese superstar Shohei Ohtani. (Ohtani offered his condolences on Instagram, posting photos of himself with the aging icon.)
After ending his playing career in 1974 (his number, 3, was retired), Nagashima became the team’s manager at just 38. He was far less successful in that role, at least initially. He pushed his players — some of whom were his former teammates — to work as hard as he did. “Bashing the players this year cultivates spirit,” Nagashima told The Japan Times.
In his first season, the Giants finished in last place for the first time. The next two years, they won the Central League pennant but lost the Japan Series. The Giants failed to win their division for the next three years, and Nagashima was let go in 1980.
Shigeo Nagashima was born on Feb. 20, 1936, in Sakura, in Chiba prefecture. His father, Toshi, was a municipal worker and his mother, Chiyo, was a homemaker. Nagashima grew up rooting for the Hanshin Tigers, the Giants’ archrival. He took up baseball in elementary school, but because of wartime shortages, he made a ball from marbles and cloth and used a bamboo stick as a bat. After graduating from high school, he entered Rikkyo University, where he started at third base. Rikkyo, typically an also-ran, won three college tournaments.
Nagashima’s wedding to Akiko Nishimura was Japan’s most-watched television broadcast in 1965. (The Asahi Shimbun/Getty Images)
After graduating from Rikkyo, Nagashima signed a then-record 18 million yen (about $50,000 in 1958) contract with the Giants. As his star rose on the field, speculation about his marital status grew. In 1964, he met Akiko Nishimura, a hostess at the Tokyo Olympic Games who had studied in the United States and spoke fluent English, which were considered marks of status and education. Their wedding was the most-watched television broadcast in Japan the following year. She died in 2007.
Their oldest child, Kazushige, played sparingly for the Giants when his father managed the club and now works in television. Nagashima’s second son, Masaoki, is a former racecar driver, and his daughter Mina is a newscaster.
After Nagashima’s first stint as a manager, he worked as a television commentator. His affable style was matched by his occasionally incomprehensible chatter. But his charisma made him an irresistible target when the Giants were looking for a new manager in 1993. Then 56, Nagashima debated whether to return to the dugout.
“My wife and I were looking forward to a quiet life playing golf, and it was hard to decide to throw myself back into the fight,” he told reporters. “But I was raised as a Giant, and if I have the strength, I will do whatever it takes for the Giants.”
Nagashima, then the Giants’ manager, celebrating with his players after they clinched the Central League championship in 2000. (Kyodo News/Associated Press)
Mellowed by age, Nagashima was easier on his players this time around. He also had the good fortune to manage Hideki Matsui, the team’s cleanup hitter and one of the most fearsome sluggers of the 1990s. (Nagashima would later criticize Japanese players, including Matsui, who joined the Yankees in 2003.) The Giants won two Japan Series titles, in 1994 and 2000, during Nagashima’s nine-year tenure. In his 15 years as a manager, his teams won 1,034 games, lost 889 and tied 59 times. The Giants made him a lifetime honorary manager.
As he was preparing to manage the Japanese team at the Olympic Games in Athens in 2004, Nagashima, then 68, suffered a stroke that partly paralyzed the right side of his body. Though he was seen less in public in the years that followed, he was no less adored. In 2013, he and Matsui were given the People’s Honor Award by Prime Minister Shinzo Abe. Eight years later, they were torch bearers at the opening ceremony at the Tokyo Games. Matsui walked slowly, holding Nagashima, as his old teammate, Oh, held the Olympic torch.
Alasdair MacIntyre, a philosopher who metamorphosed from a London Marxist into a Midwestern American Catholic during a decades-long quest to prove there was an objective foundation to moral virtue — a lonely project that struck many of his academic peers as anachronistic yet drew a large, varied and growing crowd of admirers — died on May 21 in South Bend, Ind. He was 96.
Moral beliefs are widely considered matters of private conscience — up for debate, of course, but not resolvable in any sort of final consensus. That is why, for example, people generally think teachers should guide students toward self-realization, rather than proselytize their own beliefs. The same neutrality is expected of lawyers, therapists, government officials and others.
Mr. MacIntyre belonged to a different moral universe.
In his best-known book, “After Virtue” (1981), he argued that thousands of years ago, the earliest Western philosophers and the Homeric myths generated “the tradition of the virtues,” which was treated as objective truth. Value neutrality, to Mr. MacIntyre, was the goal of “barbarians” and a sign of “the new dark ages which are already upon us.”
Such language might make Mr. MacIntyre seem like a wistful reactionary. In fact, his worldview was far less predictable.
He never entirely disavowed his youthful Marxism, applauding Marx’s critique of the individualistic and acquisitive spirit of capitalism. He maintained a certain sort of modesty from his days as a self-appointed champion of the working class — he never earned a Ph.D. and disliked being called “professor” — and he continued showing the dialectical passion of a Trotskyist, occasionally launching into what one colleague called “MacIntyrades.”
His chief opponent was what he called “modern liberal individualism,” a category in which he included not just supporters of the Democratic Party but also conventional conservatives, leftists and even anarchists. All were guilty of “emotivism”: the belief that humanity was essentially a collection of autonomous individuals who selected their own principles based on inner thoughts or feelings.
This starting point, Mr. MacIntyre argued, could lead only to eternal, unresolvable disagreement. He went so far as to suggest that every tradition of modern politics had come to “exhaustion,” and he rejected many essential tools of modern moral philosophy: Thomas Hobbes’s social contract, John Locke’s natural rights, Jeremy Bentham’s moral consequences and Isaiah Berlin’s pluralism.
In his best-known book, Mr. MacIntyre argued that the earliest Western philosophers and the Homeric myths generated “the tradition of the virtues,” which was treated as objective truth. (University of Notre Dame Press)
Instead, he valued storytelling, tradition and rational debate, embedded within a shared moral community. He found these qualities in the thinking of Aristotle and Aquinas, who promoted “a cosmic order which dictates the place of each virtue in a total harmonious scheme of human life,” he wrote in “After Virtue.” Within such an order, moral truth was objective.
“After Virtue” gained extraordinary popularity for a work of late-20th-century moral theory, selling more than 100,000 copies, Compact magazine wrote in a piece published after Mr. MacIntyre’s death, titled “Postliberalism’s Reluctant Godfather.”
That was an apt label for someone who managed, in recent years, to earn multiple tributes from Jacobin, a journal on the socialist left, and First Things, which is on the religious right. Mr. MacIntyre seemed to grow increasingly uncomfortable with his influence as it came unavoidably into focus.
In “After Virtue,” he wrote that morality arose out of a belief in human telos — the ancient Greek notion of purpose being intrinsic to existence. People of the modern world, he said, had two choices: Follow Nietzsche in trying to honestly face a world without the traditional notion of a human telos, rendering moral thought baseless, or follow Aristotle and recover moral purpose by fostering a society dedicated to the cultivation of virtue.
Mr. MacIntyre illustrated what that might look like with an analysis of what he called “practices” — shared, skillful activities including chess, architecture and musicianship — as examples of where virtue still had meaning. These pursuits, he said, intrinsically provide “standards of excellence” and reward traits like justice, courage and honesty. In them, he saw a possible modern basis for virtue.
“After Virtue” was acclaimed by leading philosophers, including Bernard Williams, who in a 1981 review for The Sunday Times of London wrote that even Mr. MacIntyre’s exaggerations were “illuminating”; that his intellectual history of the moral self was a “nostalgic fantasy” and yet also “brilliant”; and that, whatever questions the book raised, “the feeling is sustained that one’s question would get an interesting answer.”
In a subsequent book, “Whose Justice? Which Rationality?” (1988), Mr. MacIntyre provoked sharper criticism. His argument now promoted Roman Catholicism with Aquinas, not Aristotle, as its paragon of moral thought.
Mr. MacIntyre in 2022 at a conference at the University of Notre Dame, where he was a professor emeritus of philosophy. (Peter Ringenberg/University of Notre Dame)
The philosopher Martha Nussbaum wrote a memorable takedown in The New York Review of Books accusing Mr. MacIntyre of dropping some of his own principles — such as his devotion to local traditions — when discussing Aristotle, Augustine and the pope. What really interested Mr. MacIntyre, she argued, was not reason but authority: the ability of the Catholic Church to secure wide agreement, and, by extension, order.
She was one of several distinguished thinkers to challenge Mr. MacIntyre’s idealized view of the past, arguing that historical societies were not as unified as he claimed and that unanimity itself was not so great.
In a review of “Whose Justice? Which Rationality?” published in The Times Literary Supplement, Thomas Nagel wrote, “MacIntyre professes to be freeing us from blindness, but he is really asking for the return of a blindness to the difficulty of moral thought that it has been one of the great achievements of ethical theory to escape.”
Alasdair Chalmers MacIntyre was born on Jan. 12, 1929, in Glasgow. His parents, John and Emily (Chalmers) MacIntyre, were both doctors. In the 1930s the family moved to London, where his parents treated patients in the working-class East End neighborhood.
In 1949, he earned a bachelor’s degree in classics from Queen Mary College at the University of London. In the 1950s and ’60s, he earned master’s degrees in philosophy from Manchester University and Oxford while holding several lectureships.
As a student, he joined the Communist Party, but he also steered debates of Britain’s Student Christian Movement as its chairman.
In 1970 he moved to the United States, where he taught at Brandeis University and gradually left Marx for Aristotle. In the 1980s, he converted to Catholicism and took to seeing Aquinas as the master thinker of the Aristotelian tradition. He had a series of academic appointments but mostly taught at Notre Dame, where his wife, Lynn Joy, is also a philosophy professor.
Mr. MacIntyre had recently found a following among the Trump-supporting, religious, anti-consumerist and illiberal right. (Matt Cashore/University of Notre Dame)
His two previous marriages ended in divorce. In addition to Ms. Joy, he is survived by a sister, Joyce McCracken; two daughters, Jean and Helen MacIntyre; a son, Daniel; and four grandchildren. He and his wife lived in Mishawaka, Ind., a city near Notre Dame.
For decades, no single tendency seemed to define readers who took inspiration from Mr. MacIntyre’s work. There were heterodox Marxists, the skeptic of liberalism Christopher Lasch and the former Republican presidential candidate Rick Santorum.
But more recently, one constituency claimed Mr. MacIntyre’s work most completely and prominently: the Trump-supporting, religious, anti-consumerist and illiberal right. Two leading commentators of this world, Patrick Deneen and Rod Dreher, have written books that pay tribute to Mr. MacIntyre.
In 2017, the publication of one of these books, Mr. Dreher’s “The Benedict Option,” prompted an odd debate between Mr. Dreher and Mr. MacIntyre, with each man accusing the other of commenting on a book of his that he had not actually read.
During a lecture at Notre Dame, Mr. MacIntyre deplored becoming part of an ideological battle of his own time.
“The moment you think of yourself as a liberal or a conservative,” he said, “you’re done for.”