

Oak Ridge Reservation: Public Health Assessment Work Group

Historical Document

This Web site is provided by the Agency for Toxic Substances and Disease Registry (ATSDR) ONLY as an historical reference for the public health community. It is no longer being maintained and the data it contains may no longer be current and/or accurate.

Public Health Assessment Work Group

January 3, 2003 - Meeting Minutes


Attendance

ORRHES Members attending:
Bob Craig (Work Group Chair), Kowetha Davidson (ORRHES Chair), W. Don Box, George Gartseff, David Johnson, James Lewis, Tony Malinauskas, Pete Malmquist, LC Manley, and Charles Washington

Public Members attending:
Gordon Blaylock, Tim Joseph, and Roger Macklin

ATSDR Staff attending:
Jack Hanley (phone), Bill Murray, Lorine Spencer (phone), and Dee Williamson

ERG Contractor:
Liz Munsen (phone)

Agenda

  1. Introductions – Bob Craig
  2. Minutes from December 16, 2002, meeting - Bob Craig
  3. Health symptom and disease studies – Dee Williamson, ATSDR
  4. Request for data to TN Cancer Incidence Registry – Bob Craig/George Gartseff
  5. New business – Bob Craig

Call to Order: Bob Craig called the PHAWG meeting to order, and attendance was noted for the record.

Minutes from the December 16, 2002, Meeting
Bob Craig asked for a motion to approve the minutes. W. Don Box motioned to approve the minutes and Pete Malmquist seconded the motion. The December 16, 2002, minutes were unanimously approved.

Health Symptom and Disease Studies

Presenter: Dee Williamson, ATSDR

Purpose

The purpose of Dee Williamson’s presentation was to provide information that would assist the PHAWG in deciding whether or not to use cancer incidence registry data. Ms. Williamson presented (1) detailed information on descriptive and analytic epidemiologic studies and (2) aspects of a health outcomes data evaluation.

Descriptive Epidemiologic Studies

Dee Williamson detailed three types of descriptive studies: cancer incidence, cancer mortality, and cancer symptom and disease prevalence. Ms. Williamson noted that these studies describe what is “going on” in a particular area, but because they are descriptive analyses, they do not show cause and effect.

Cancer Incidence Analysis

Cancer incidence analysis studies use registry data, an existing source of information. When a doctor diagnoses a new case of cancer in someone living in a particular area, the case is reported to the cancer incidence registry. Dee Williamson believed that the Tennessee cancer incidence registry began in the 1980s and that the information used in this registry is from the early 1990s. She also stated that in these studies, no environmental samples would be collected from the area, nor would any biological samples be taken from individuals.

Cancer Mortality Analysis

Cancer mortality analysis studies also look at existing sources of data; however, these studies rely on information from death certificates. Since these studies look at vital statistics records, they can date back further than other types of analyses. These studies have the following limitations: information on other risk factors is not usually available (e.g., data on family predisposition toward a disease are unknown) and discrepancies with the cause of death could occur (e.g., a person had cancer but died in a car crash).

Symptom and Disease Prevalence Study

The symptom and disease prevalence study looks at people who may have been exposed to a contaminant and compares them to a different population that was not exposed. This study looks at the frequency of different symptoms and diseases that individuals may have, and assesses if these occur more or less often in the exposed community when compared to the unexposed community. These studies are limited in that they normally rely on self-reported data. However, a clinical specimen (e.g., blood sample) is sometimes available.
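
For illustration only, the comparison a symptom and disease prevalence study makes can be sketched with hypothetical counts; the figures below are invented and do not come from any ORR survey.

    def prevalence(cases, population):
        # Proportion of people in a group reporting the symptom or disease.
        return cases / population

    # Hypothetical self-reported counts for an exposed community and an
    # unexposed comparison community.
    exposed_cases, exposed_n = 120, 1000
    unexposed_cases, unexposed_n = 90, 1000

    p_exposed = prevalence(exposed_cases, exposed_n)        # 0.12
    p_unexposed = prevalence(unexposed_cases, unexposed_n)  # 0.09

    # A prevalence ratio above 1 means the symptom is reported more often in
    # the exposed community; by itself this does not establish cause.
    prevalence_ratio = p_exposed / p_unexposed
    print(f"Prevalence ratio: {prevalence_ratio:.2f}")      # 1.33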

Analytic Epidemiologic Studies

Dee Williamson outlined two types of analytic epidemiologic studies: case control and cohort. Ms. Williamson stated that these two studies are used often in epidemiological research.

Case Control Study

Case control studies look at exposures and diseases that people have. Dee Williamson used lung cancer as an example. She said that you could take people who have lung cancer and those who do not, and look for differences between the two groups to identify risk factors. She explained that these types of studies were used to discover that smoking increases the risk of lung cancer.

A case control study attempts to answer the question, “Is there an association between a disease and an exposure?” Ms. Williamson explained that exposure could consist of a number of different elements, including environment, smoking, diet, and other factors. In addition, this type of study usually looks at things that happened in the past.

Dee Williamson stated that a case control study is a “good study” because it deals with a defined population - people who are affected with a disease. Unlike the descriptive studies, a case control study can examine cause and effect. She said that hopefully over time, this study can identify risk factors that caused a disease. However, she stated that one study alone will not show the risk factors.
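
For illustration, the association a case control study measures is commonly summarized as an odds ratio from a 2x2 table of exposure by disease status. The sketch below uses hypothetical counts, not data from any Oak Ridge study.

    import math

    # Hypothetical 2x2 table:
    #                 cases (disease)   controls (no disease)
    # exposed               a = 40            b = 60
    # unexposed             c = 20            d = 80
    a, b, c, d = 40, 60, 20, 80

    odds_ratio = (a * d) / (b * c)  # 2.67

    # Approximate 95% confidence interval (Woolf's method, on the log scale).
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lower = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
    upper = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

    print(f"Odds ratio: {odds_ratio:.2f} (95% CI {lower:.2f}-{upper:.2f})")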

Cohort Study

The enrollment of people in a cohort study is based on the presence or absence of an exposure (i.e., workers, residents, etc., whose exposure can be documented). A cohort study looks at this group of individuals and its potential health outcomes. The cohort is followed over time to see if people are exposed and to see what, if any, diseases progress. According to Ms. Williamson, this study is based on exposure, whereas a case control study is usually based on disease.
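
As an illustration of the comparison a cohort study makes, the sketch below follows a hypothetical exposed group and an unexposed group over time and summarizes the result as a relative risk; all counts are invented.

    # Hypothetical counts of disease after follow-up.
    exposed_cases, exposed_n = 30, 2000      # group enrolled because exposure was documented
    unexposed_cases, unexposed_n = 15, 2000  # comparison group with no documented exposure

    risk_exposed = exposed_cases / exposed_n        # 0.015
    risk_unexposed = unexposed_cases / unexposed_n  # 0.0075

    # A relative risk above 1 means disease occurred more often in the exposed group.
    relative_risk = risk_exposed / risk_unexposed
    print(f"Relative risk: {relative_risk:.1f}")    # 2.0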

Health Outcomes Data Evaluation

This evaluation looks at cancer, birth defects, and any health-related outcome where there is information on different diseases. Dee Williamson presented a diagram to the PHAWG that demonstrated the health outcomes data evaluation process. The diagram contained several questions and had a “yes side” and a “no side”; one side showed the steps to take if certain questions were answered by “yes” and the other side was followed if specific questions were answered by “no.” The questions are listed below:

“Yes” Side of Diagram

  1. Is there a completed exposure pathway?
  2. Is there a determined exposure extent and duration?
  3. Can the exposed population be defined?
  4. Were there sufficient exposure levels and latency? (For example, if contaminant Y were reported to have caused a type of cancer within a 2-year period, but that kind of cancer typically takes 20-30 years to develop, the answer would be no because the latency period was not long enough.)
  5. Are the geographical units similar between the population exposed and the health outcome data?
  6. Is the health outcome biologically plausible?

Jack Hanley suggested that Dee Williamson use iodine as an example in the health outcomes data evaluation diagram. She asked Mr. Hanley the questions and he responded:

1) Is there a completed exposure pathway?
Answer: Yes, likely.

2) Is there a determined exposure extent and duration?
Answer: There are specified years for duration. The problem with extent is that there is a lot of uncertainty in the Dose Reconstruction Feasibility Study. The range varies from 1 millirem per year (mrem/y) to 200 mrem/y, thus the exposure extent is unknown.

3) Can the exposed population be defined?
Answer: This population varies because the information in the Dose Reconstruction Feasibility Study is variable. Mr. Hanley does know that some populations were exposed, but he is not sure about people who live further away from ORR.

4) Were there sufficient exposure levels and latency?
Answer: The analysis indicates that people were exposed to either very high or very low levels. In regard to latency, the last exposure would have occurred in 1956, over 46 years ago. Mr. Hanley stated that 40 years after an exposure, a person is unlikely to develop a cancer related to that exposure.

5) Are the geographical units similar between the population exposed and the health outcome data?
Answer: The geographic areas are not clearly defined at this point. Dee Williamson added that “the cancer registry data are related to the census tract.” She stated that if you could get the registry data for a specific area rather than by census tract, then you could compare the information.

6) Is the health outcome biologically plausible?
Answer: The only plausible outcome is thyroid cancer with relation to iodine.

“No” Side of Diagram

1) Is there a completed exposure pathway?
Dee Williamson stated that this situation often occurs at sites: there may have been exposures on site, but the exposures are not moving off site. However, the community is concerned, and therefore you can answer yes to the next question.

2) Are there community health concerns?
Dee Williamson stated that ATSDR conducts many health statistic reviews because of community health concerns. She provided a study that she conducted in Memphis as an example. There was no off-base contamination or exposure, but the community was concerned about cancers in their area. Thus, a cancer review was conducted.

3) Is there a potential exposure pathway?
If there is neither a potential exposure pathway nor any health outcome data, Ms. Williamson indicated that there are times when a health evaluation could still be done. She added that if there is no exposure pathway, then there is no way to show that people are being exposed to something that is causing a certain health effect. She said this could mean that something else is going on in a community that is not related to the site.

4) Are the geographical units similar?
Ms. Williamson stated that the main purpose of this question is to look at data available at a county level, and then assess the rates of that disease in a smaller area. She stated that you will be able to compare similar geographic units.

Conclusion

Dee Williamson told the PHAWG that they needed to consider three questions. First, what do you want answered? Second, what data are available? And third, what questions can the data answer? She stated that the work group needed to be clear on what the data can tell them. She said that if they were looking at exposures from 1950 and people who developed thyroid cancer, then the registry will not be able to answer their question.

Discussion

Pete Malmquist asked if cancer cases would be reported in either the cancer incidence or mortality reports. He used the following question as an example. If there were 15 cases of cancer in Roane County, would these be recorded somewhere in the registry? Ms. Williamson responded that if a person lived in an area where s/he was diagnosed, then the person would be included in the registry. However, she stated that if someone lived in an area for 30 years and then moved, the person would be counted in the new area.

Tony Malinauskas asked if it is impossible to know the cause in a symptom and disease prevalence study. Ms. Williamson said that Mr. Malinauskas was correct and that you were not able to decipher the cause in these studies. Kowetha Davidson added that the cause could only be determined in these types of studies if a disease had a single cause, which is very rare.

Charles Washington commented that many people in the area may think that cancer death would not be listed. He said it has been suggested that residents in the community believe that cancer would not be listed because it may be related to their work, and consequently could implicate facilities in the area. He continued by stating that he knows some doctors have “acted against” the Oak Ridge area residents because some of the local facilities caused some of the diseases. He suggested that medical records be reviewed for accuracy in documenting cases.

James Lewis mentioned a particular concern about how to deal with anecdotal information. He read the following community concern to the PHAWG, “Over 80% of people die from cancer; grandfather has spot on lung; husband passed of leukemia; cancer from the plant or the water; husband died of cancer in 1996, worked 39 years at ORR: Everybody around here dies with cancer; Did living here have anything to do with it? Cancer killed 2 brothers, mother, and husband; high rate of breast cancer; cancer possibly due to vegetable garden.” Mr. Lewis was concerned that some members of the public reading this type of anecdotal information could foster this type of thinking in an area, even without any supporting data.

Dee Williamson responded to Mr. Lewis by saying that her first reaction in this situation is to believe the person. She stated that she would first ask about the type of cancer. If the same type of cancer already has been reported, she stated that she would further investigate to see what is happening in the area. In addition, she would look at the age of the people who were affected. She said you would see more cases in an older population. Also, if the cases are in the same family, she would investigate if the cancer was hereditary. Furthermore, Ms. Williamson said that she was not sure if a health statistics review would answer the necessary questions. She added that when looking at a small population, it is difficult to get a valid answer; a larger population would be needed.

Kowetha Davidson commented that it is not unusual for a lot of people to die from cancer. She added that one would expect to see a lot of cancers in a population because it is probably the major cause of death. Charles Washington stated that he believed that more people die from heart attacks. However, Dr. Davidson said that you need to look at all different types of cancer.

Pete Malmquist asked a question related to the number of cancers. He wanted to know if it was possible to differentiate between exposure at work and exposure outside of work. He reminded the PHAWG that they are supposed to focus on non-worker exposure. Dee Williamson responded that when looking at cancer mortality, cancers are compiled by each county with no differentiation for work or non-work exposures. Bob Craig added that to his knowledge, records are grouped by county and not by worker and non-worker exposures. Kowetha Davidson stated that types of data are usually not differentiated except when dealing with studies concerning worker populations.

Jack Hanley spoke in reference to the three studies that Dee Williamson had outlined. He stated that these studies only look at health outcome data. With these studies, there is no way to differentiate between on site, off site, when the person was exposed, and what the person was exposed to. He said that these studies are descriptive and do not look at exposure data.

James Lewis asked how you can separate data that are a combination of a worker on the job and outside of the job. He said that many people do not separate these two in their minds and that a constant in many people’s minds is...“There is a contaminant over there that I believe is related to cancer.”

Dee Williamson reiterated that you cannot separate these types of studies. She reviewed that cancer incidence deals with where people were diagnosed and does not take a person’s job into account. Cancer mortality looks at where people were living when they died and their cause of death. Also, she said that symptom and disease prevalence could probably ask questions about occupation and exposure. However, Jack Hanley mentioned that this would be self-reported data and thus difficult to use and apply.

In regard to case control studies, Kowetha Davidson asked what would be a reasonable number of cases and controls to determine if there is a cause and effect. Dee Williamson responded that no set number exists, but that the biggest concern is the number of cases because those are the people who have the disease. She said that the more people you have in a study, the better the results. In addition, Ms. Williamson mentioned that a couple hundred cases and controls would probably be best, but that she could not give a specific number.

Tim Joseph did not think that Kowetha Davidson’s question had been answered. Mr. Joseph told the PHAWG that he did not think the number of cases would ever show cause and effect in a case control study. Dee Williamson said that over time, these studies are replicated and will show the same results. Mr. Joseph then asked Ms. Williamson if exposure data were still needed and she responded that it was needed. She continued by saying that the study will have a certain number of cases and within the cases, there will be a certain number with different types of exposure. Dr. Davidson stated that she had been assuming exposure for a case control study.

Charles Washington stated that a de minimis level has to be set in order to achieve valid results. He said that more people would produce a better and more valid study. Dee Williamson agreed and stated that a study has to have enough people in order to show valid results. She stated that there are statistical methods used to indicate how many people are needed to yield valid results.
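
As an illustration of the statistical methods Ms. Williamson refers to, the sketch below applies a standard two-proportion sample size formula using Python's standard library; the exposure proportions are purely illustrative.

    from statistics import NormalDist

    def n_per_group(p1, p2, alpha=0.05, power=0.80):
        # Approximate number of participants needed in each group to detect a
        # difference between proportions p1 and p2 (two-sided test).
        z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
        z_beta = NormalDist().inv_cdf(power)
        p_bar = (p1 + p2) / 2
        numerator = (z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
                     + z_beta * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
        return numerator / (p1 - p2) ** 2

    # e.g., detecting an exposure prevalence of 30% in cases versus 20% in controls
    print(round(n_per_group(0.30, 0.20)))  # about 293 per group with these illustrative numbers

A result of roughly a few hundred per group is consistent with the “couple hundred cases and controls” mentioned earlier in the discussion.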

James Lewis asked if the results from case control studies are ever shared. Bill Murray responded by saying that he used to conduct worker studies and provided an example of a case control study that was related to leukemia. He said that to conduct these studies, he would go to one or more plants, pick out workers who died from leukemia, obtain controls who matched the deceased workers, and conduct an analysis to see if the people who died from leukemia were more likely to have been exposed. Mr. Murray stated that using a case control methodology is one of the best ways to conduct a study on workers. Jack Hanley added that all of the worker studies conducted by the National Institute for Occupational Safety and Health (NIOSH) and the Department of Energy (DOE) at the ORR are summarized in the Compendium of Public Health Activities at the Oak Ridge Reservation.

Following Bill Murray’s comments, Charles Washington stated that he did not think that personal information, by name, could be obtained on individuals from these various facilities. Bill Murray stated that under the Occupational Safety and Health Act (OSHA), NIOSH has permission to obtain individual exposure information and personal data from any plant in the United States. He also stated that, even if the records do not show the quantity of exposure, methods are used to estimate exposures.

Bob Craig stated that he is familiar with more recent worker studies related to the litigation at the Paducah Gaseous Diffusion Plant (PGDP), in Paducah, Kentucky. He said that in this case, worker names were taken out and assigned a number for identification purposes, but the complete history of the worker was obtained. Bill Murray stated that NIOSH uses workers’ names to identify causes of death.

James Lewis discussed a group of workers who have been receiving compensation because they have diseases and cancer resulting from working in plants. He asked the PHAWG if this financial payment was based on information obtained from case studies. Bill Murray responded that these people are being compensated under the Energy Employees Occupational Illness Compensation Program Act. He added that information on what cancers are associated with certain types of exposure is obtained from studies of worker cohorts. Kowetha Davidson continued by saying that for this compensation system to work, a conclusion that a particular exposure is related to a disease has to be drawn.

Pete Malmquist commented that since the PHAWG is dealing with a non-worker population, it seems almost impossible to conduct a cohort study. He continued that there are no records for the non-workers and exposure cannot be proven. In addition, Mr. Malmquist asked how to look at long-term exposure for people who live in different places. Bob Craig responded to Mr. Malmquist by stating that the public health assessment (PHA) should answer these questions because it looks at the contaminant of concern (COC), determines what populations could have been exposed, and determines at what levels. Mr. Craig said that following the PHA, an epidemiological study would be the next step.

Jack Hanley agreed with Bob Craig and stated that the PHA identifies exposure. After the PHA, Mr. Hanley stated that the best follow-up public health activity could be education, a study, or another activity. In addition, he said that analytical epidemiological studies could be conducted if exposure data were available. He added that if this type of study was important to the community or would be of scientific benefit, then the studies could be done, presuming exposure was known.

Jack Hanley told the PHAWG that the Dose Reconstruction Feasibility Study was conducted by the state of Tennessee to see if past doses could be reconstructed, and thus examine past exposures. Mr. Hanley explained that the state’s panel reviewed the results of the reconstruction, and it was determined that any further increases were unlikely, except for iodine. As a result of these findings, the panel did not recommend continuing with an epidemiological study. Bill Murray read this recommendation from the panel to the PHAWG, ... “formal epidemiological studies of populations exposed to iodine 131, mercury, PCBs, and radionuclides from White Oak Creek are unlikely to be successful and should not be performed at this time.”

Jack Hanley elaborated on the epidemiological studies. He stated that the descriptive studies are used to give a general picture of the occurrence of cancer in a community. These studies cannot relate a disease to an exposure (i.e., cancer to the reservation at Oak Ridge). However, descriptive studies are helpful for public health because they show the pattern of disease, what diseases people are getting, whether diseases increase or decrease over time, and where diseases are occurring.

Dee Williamson asked Jack Hanley about the time it takes thyroid cancer to develop. Mr. Hanley responded that thyroid cancer takes a minimum of four to five years to develop in children who were exposed. He also stated that it was unlikely to find cancers related to an exposure after 40 years. Kowetha Davidson commented that after this time period (40 years), it is a person’s age that would be associated with an increase in thyroid cancer instead of an exposure.

Jack Hanley asked Dee Williamson about the differences between descriptive and analytical epidemiology studies. He asked if the main difference is that one deals with general population data (descriptive) and the other (case control or cohort) deals with individuals and individual exposure. Ms. Williamson responded that this distinction holds for cancer incidence, cancer mortality, and health statistics reviews, but not for the symptom and disease prevalence study, because those studies rely on data from individuals. In addition, although the information is self-reported in symptom and disease prevalence studies, case control studies use documented disease data and cohort studies use documented exposure data. Mr. Hanley added that more clearly defined exposures will yield a better study.

James Lewis asked Dee Williamson how she handles cases where there is not a completed pathway, but there are elevated levels of a particular disease (i.e., there is an excess of thyroid cancer in an area, but there is no identified exposure from the site). Tony Malinauskas stated that this is a case where the geographical unit is put before the potential exposure pathway. Ms. Williamson responded to Mr. Lewis’s question by stating that there would be many follow-up activities for this type of situation. First, an analysis could look at how long individuals have lived in the specific area. Second, the researcher could look to see if other risk factors, other than iodine, are associated with thyroid cancer.

Charles Washington said that the PHAWG needs to take into consideration that the Oak Ridge Reservation refers to the facilities, but also to the people living there. He stated that they are discussing actions that took place before the Clean Air Act and that workers could have brought contaminants home with them. He said that at that time, restrictions were not put on items in the laboratory. Mr. Washington said that chemicals were poured down the drain at these facilities.

Pete Malmquist asked Dee Williamson about geographic units. He wanted to know if it was better to study larger areas with larger populations. He was concerned that if a smaller population was studied, then they would have to look at every small area in each county. Ms. Williamson responded that it should be done by census tract because this is the lowest level of data available. She stated that the study she conducted in Memphis used six defined census tracts with small populations and that this tended to cause problems. Mr. Malmquist asked about looking within a county that has a higher incidence of a disease, and then looking closer at the smaller population. Ms. Williamson indicated that this would be extremely arduous and Mr. Washington said it would depend on the state of the chemical (solid, liquid, or gas).

Jack Hanley responded to Mr. Malmquist’s question. Mr. Hanley stated that data on cancer incidence are available by census tract. He continued by saying that there are difficulties because the number of people in census tracts in rural areas is relatively small. He said that if there were one or two cases, this can cause problems because small numbers can distort rates dramatically. Dee Williamson reiterated this point by saying that when you look at a smaller level, it throws the results off. Mr. Malmquist also asked whether, if there is an increased incidence of cancer in a county, you could return to the census tract level to examine whether a cluster exists in a specific area. Ms. Williamson stated that even if there is an excess of a disease at the county level, an elevation may not exist at the census tract level.
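
For illustration of why small numbers distort rates, the sketch below computes incidence rates per 100,000 for a small census tract and a whole county; the populations and case counts are hypothetical.

    def rate_per_100k(cases, population):
        # Crude rate expressed per 100,000 people.
        return cases / population * 100000

    tract_pop = 2500    # small rural census tract (hypothetical)
    county_pop = 50000  # whole county (hypothetical)

    # A single additional case doubles the tract rate but barely moves the county rate.
    print(rate_per_100k(1, tract_pop))    # 40.0 per 100,000
    print(rate_per_100k(2, tract_pop))    # 80.0 per 100,000
    print(rate_per_100k(25, county_pop))  # 50.0 per 100,000
    print(rate_per_100k(26, county_pop))  # 52.0 per 100,000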

Tony Malinauskas said that the PHAWG needs to address anecdotal information. He stated that they need to determine whether or not the data are factual.

Kowetha Davidson reminded the PHAWG that when they discuss cancer, they are looking at various diseases. She stated that all cancers are separate diseases (e.g., lung cancer, breast cancer, thyroid cancer, liver cancer) and that the PHAWG needs to break the cancers down into particular cancers. She continued by saying that cancer is caused by more than chemicals and radiation and that the PHAWG needs to know the association of an exposure with a particular population. A meeting participant agreed with Dr. Davidson and said that the group needs to look at the numbers of cases and eliminate cancers that are not elevated or of concern. He also recommended that a flow chart be developed for each cancer and pathway.

Charles Washington asked if these types of data (e.g., different kinds of cancer) are kept in the registry. Jack Hanley suggested that Dee Williamson show the tables from the Memphis study to the PHAWG to illustrate what another study had investigated.

Ms. Williamson presented the Memphis tables to the PHAWG. The study looked at six census tracts. Men and women were analyzed separately, and 23 types of cancer were examined for each gender. The number of people who had each cancer was determined and compared to national rates to assess how many cases would be expected to occur in that area. The number of observed cases was then compared to the number expected to see whether an elevation existed. She stated that over the six- to seven-year period that was considered, no women with bone cancer were reported and some types of cancer had only one reported case. She said that conclusions cannot be drawn by looking at one case.
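
The observed-versus-expected comparison described for the Memphis tables can be sketched as follows: the expected count comes from applying outside (e.g., national) age-specific rates to the local population, and the ratio of observed to expected indicates whether an elevation exists. All figures below are hypothetical, not the Memphis data.

    # (age group, local population, national rate per 100,000 per year) - hypothetical
    strata = [
        ("0-39", 8000, 2.0),
        ("40-64", 5000, 15.0),
        ("65+", 2000, 80.0),
    ]
    years = 6      # length of the period reviewed
    observed = 18  # cases actually reported in the registry (hypothetical)

    expected = sum(pop * rate / 100000 * years for _, pop, rate in strata)

    # Standardized incidence ratio: values above 1 suggest more cases than expected.
    sir = observed / expected
    print(f"Expected: {expected:.1f}, Observed: {observed}, SIR: {sir:.2f}")  # Expected: 15.1, SIR: 1.20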

Ms. Williamson stated that the cancer registry can be used to look at all types of cancer. However, she suggested breaking down cancers by specific type. She said that in a larger area, this would be a better way to see if an elevation exists. Tony Malinauskas asked if certain types of cancer could be excluded. Ms. Williamson responded that the PHAWG could choose whatever they want to examine. However, she indicated that age should be taken into account because different cancers are related to certain age groups. She also recommended using adjusted, rather than unadjusted, rates.
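
The adjusted rates Ms. Williamson recommends can be illustrated with a direct age standardization sketch: local age-specific rates are weighted by a standard population's age distribution so that areas with different age structures can be compared. The weights and rates below are invented, not the actual U.S. standard population.

    # (age group, local rate per 100,000, standard population weight) - hypothetical
    strata = [
        ("0-39", 5.0, 0.55),
        ("40-64", 45.0, 0.30),
        ("65+", 300.0, 0.15),
    ]

    # Weighted average of the age-specific rates, using the standard weights.
    adjusted_rate = sum(rate * weight for _, rate, weight in strata)
    print(f"Age-adjusted rate: {adjusted_rate:.1f} per 100,000")  # 61.2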

James Lewis asked about an article that was printed in The Tennessean. According to Mr. Lewis, the article documented interviews where residents identified their various symptoms. Mr. Lewis stated that many people will assume this paper is accurate and he wanted to know how to deal with this type of situation. Ms. Williamson stated that she has dealt with many communities who have conducted their own symptom surveys. She said when ATSDR goes into a community, it may decide to conduct a symptom and disease prevalence study, which entails looking at a defined area that has been exposed and comparing it to an unexposed population. She added that the problem with community survey data is that the members participating in the study are usually ill and already have various symptoms.

Jack Hanley provided a follow-up to Mr. Lewis’s question. He stated that after the newspaper article was released, Senator Bill Frist had asked the Secretary of the Department of Health and Human Services, Donna Shalala, to investigate the article. Secretary Shalala’s office determined that the article contained anecdotal information and that problems existed with the geographic area. Secretary Shalala sent a letter to Senator Frist in 1996 or 1997 stating that the data in the article were not usable for any type of valid epidemiological study.

Mr. Lewis responded by stating that this letter was not publicized as highly as the article. He wondered how to change people’s perception that the article was factual information. Ms. Williamson referred to the article and used an example where one woman reported a symptom of hair loss. She stated that if this is a concern in the area, there are no reported data to compare it to. She said that the question “is the health outcome data available?” refers to whether information has been collected on the number of people reporting a symptom, which is needed for comparison. For instance, if people were worried about cancer, the cancer registry could be utilized.

James Lewis asked what type of study can be conducted if no comparison population exists. Ms. Williamson responded that it depends on the particular concern. She gave an example of asthma, stating that there are different risk factors that can affect this illness. She said that you could try to educate the community or use other methods to help.

Kowetha Davidson also commented on the article. She stated that all types of bias are related to how this type of information was collected. Thus, it is necessary to know how the information was collected.

Dee Williamson added to the article discussion. She said that these types of findings are what “makes the paper” and that this type of information does not look at how many people around the site do not have those symptoms. She detailed how the other types of studies discussed at this meeting sample people using an already defined mechanism. In addition, participants for these studies are selected randomly, but in an organized manner.

Kowetha Davidson brought up a point regarding clusters of diseases. She stated that it is easier to start looking at a particular disease if it is unusual and in a defined area. She provided an example of a study that was conducted because children were developing particular diseases. The study determined that a housing community was built on a waste dump and that very high exposures were occurring through the soil. Dr. Davidson pointed out that it is more difficult to associate very low exposure levels with disease outcome.

Request for Data to Tennessee Cancer Incidence Registry

Presenters: Bob Craig and George Gartseff, ORRHES

Bob Craig began by stating that George Gartseff had the task of drafting an executive decision with the sub workgroup committee. Bill Murray stated that the members of this committee were: Pete Malmquist (Chair), Tony Malinauskas, Charles Washington, George Gartseff, Tim Joseph, Bob Craig, and James Lewis. As the PHAWG had not received copies of this decision, Bob Craig suggested that copies be handed out and that the PHAWG review the document for discussion at their next meeting. It was also suggested that James Lewis prepare a flow chart that includes different types of cancers and additional information; this flow chart will be included in the recommendation.

Details of the Guidance for ATSDR Health Studies (1996)

Presenter: James Lewis, ORRHES

James Lewis discussed different aspects of the Guidance for ATSDR Health Studies. He stated that the document summarizes many things that the PHAWG has heard and that it establishes guidance and policy on how to conduct a study. Mr. Lewis discussed a Health Statistics Review that looks at existing data and falls under a Type-1 Health Study as defined in this guidance manual. Dee Williamson confirmed that the PHAWG would be conducting this type of study, which Mr. Lewis noted is often a more exploratory type of effort than a Type-2 Health Study.

Mr. Lewis also commented on the section in the manual that discusses “when not to do health studies” and “when to do health studies.” He referred to the last bullet under the “when to do a health study” section that lists criteria used to determine if a health study should be conducted. Mr. Lewis read the following item: “documented excess of an adverse health outcome, when known.” He said that as you continue to read the document, it discusses certain limitations and aspects that people need to have to be considered for a health study. Mr. Lewis questioned what ATSDR does with the information if it does not meet the criteria. He stated that it appears as though, ... “If you don’t find an exposure, you don’t plan to do a whole lot.” He suggested that the PHAWG read and study the document, and if they have any problems with ATSDR’s approach, they need to note it. If there is not a problem with the approach, then Mr. Lewis thinks that the PHAWG should consider endorsing the document and include a flow chart.

Mr. Lewis referred to page four of the document under the section Considerations for Proceeding With a Health Study, Ability to Provide Definitive Results. He was concerned that data could be collected and not yield definitive results.

Mr. Lewis made a formal request that the PHAWG figure out exactly what they are asking for and include this in their recommendations. He suggested that the PHAWG look at this guidance manual so that they can see what ATSDR is looking for in its determination at the site and consider this when making their recommendations.

Discussion


Kowetha Davidson commented that if nothing is found when the data are compared to two larger entities, then she does not believe there is anywhere further to go with the information. She stated that in the Memphis data, there were no elevated levels. Dee Williamson stated there was also no off-site exposure.

Dee Williamson thought it would be helpful to clarify the different areas of ATSDR. She stated that Jack Hanley’s area of ATSDR conducts public health assessments. His task is to look at exposure to see if contamination is getting to people and, if so, how it is reaching them. If exposure is occurring, then he looks at duration, pathway, and how many people are exposed. After this happens, the area of potential concern is given to the epidemiology group, which decides if a study should be conducted based on specific criteria. She stated that certain criteria have to be met, including identifying an exposed population and being able to conduct a study that is scientifically valid.

Kowetha Davidson reminded the PHAWG that ORRHES has asked for cancer registry data. If the numbers are not elevated when compared to other data, she cannot see where else they can go with the particular data. She said that even if an elevated level was not found, they would still have answered many peoples’ questions. Jack Hanley asked Dr. Davidson what they would do if there is an elevated rate, no exposure related to the increase, and no plausible outcome related to a contaminant of concern. Dr. Davidson responded that they may have to conduct a health study. Mr. Hanley then asked Dr. Davidson what to do if there is a contaminant of concern that is related to elevated rates of cancer. He stated that it is easy if the rates are not elevated, but if there is an elevated rate, then they have to decide how to handle the increase. He said that it is better to know on the “front end” how to handle a potential elevation. Mr. Hanley told the PHAWG that they need to have a “general path” to follow if the rates are elevated or not elevated.

Charles Washington said that if there was an elevation, then a study would naturally be continued. Jack Hanley then asked how it would be handled if there was no exposure associated with the disease. Dr. Davidson said it would have to be concluded that there was no exposure from the DOE site that is associated with the particular contaminant.

Dee Williamson reiterated that the cancer incidence data tell you what is currently going on in the community. She said this usually does not reflect exposures from the 1950s, unless the people have lived there since that time. She stated that if there were an elevated number of cancers, this would not necessarily mean that the elevation was caused by the site. Also, if the cancer rates were not elevated, it would not mean that there was no exposure from the site. Ms. Williamson told the group that even if rates were elevated, ATSDR would not do a study if there were no exposure data. However, she said that ATSDR could do other site-related activities.

Tim Joseph commented on the ability to provide definitive results. He thought that the PHAWG needed to ask if they want a study that is inconclusive by design. He said that they need to look at the end result and decide how they will handle the results before they start. Mr. Joseph also stated that they will not know if a cancer/disease is or is not caused by an exposure because they do not have exposure data. Kowetha Davidson commented that she does not see “having no exposure” as an inconclusive result. Mr. Joseph responded that they cannot say if there is or is not an exposure. He does not think that they will be able to determine if the cancer or disease prevalence could have been caused by on-site or off-site exposures. Dee Williamson added that they cannot link exposures from the site and cancer incidence.

Kowetha Davidson brought up that the ORRHES has requested cancer registry data on cancers that could possibly be associated to COCs. She is trying to get the PHAWG focused on the request of the ORRHES and to not make the situation more complicated than it needs to be.

Bob Craig concluded the discussion and stated that all of the PHAWG members needed to read the initial draft which will be discussed at the next meeting. He stated that the sub workgroup will consider these ATSDR procedures, make comments, and re-draft as appropriate.

Additional Comments/Concerns

General

James Lewis stated that he is concerned because he thinks that the PHAWG sometimes does not know what they are asking for. He wanted to know if they are asking for a health statistics review. Kowetha Davidson responded that the ORRHES has asked for cancer registry data for eight counties, with a focus on cancers that may be associated with the contaminants of concern. Dr. Davidson said that it would be alright to recommend that ORRHES compare the data for the state and nation. However, she stated if the PHAWG wanted to recommend that an additional study be conducted, she would not agree with this.

James Lewis asked George Gartseff what he thought was requested. Mr. Gartseff replied that he presumed that they would need information on cancer outcome related to COCs. He thought that the ORRHES was requesting cancer data to support this part of the assessment. He added that while they are looking at cancer data, he thought they should also look to see if there are any other cancers in the area (statewide or other) that are in excess and that may cause the group to revise the COC list. He said he does not want a health study, but is asking for a comparison study. He stated that he wants to document the reasons why certain things were not looked at, even if there were elevated levels.

Kowetha Davidson agreed with George Gartseff and said that an elevation cannot be determined unless it is compared against something. Bob Craig added that the definition of a health survey is that cancers will be surveyed and a comparison will be made.

Kowetha Davidson thought that the PHAWG needs to “keep it simple” and not focus on the type of study that ATSDR conducts. She said that raw cancer registry data should be taken and compared to state and national data. Dee Williamson added that cancers should be chosen according to the concerns of community members and not based on COCs from the site.

Screening Level

Bob Craig asked Jack Hanley if he had discussed the 71 mrem/y write-up with Paul Charp. Mr. Hanley responded that he had briefly mentioned it to Dr. Charp, but that he will talk to him again on the topic.

New Business

The Chair asked the PHAWG if there was any new business. There was no new business. The next PHAWG meeting is scheduled for January 21, 2003.

Update on the Public Health Assessment and Project Plan

Presenter: Jack Hanley, ATSDR

Summary

Jack Hanley gave an update on the Uranium Public Health Assessment that was completed by December 31, 2002. Mr. Hanley stated that the PHAs should be mailed out within the next week. He requested that comments from the PHAWG be sent to Tony Malinauskas, who will compile all of the comments. Mr. Hanley anticipates that group comments from the PHAWG will be sent to the ORRHES and then to ATSDR. He said that this initial release is usually sent to government agencies, but he would like the PHAWG’s and ORRHES’s input and ideas. In addition, the U.S. Environmental Protection Agency (EPA), DOE, and the Tennessee Health Department will review the document. After these comments are addressed, the public comment release PHA will be issued.

Following Mr. Hanley’s comments, the meeting was adjourned.


 