Comprehensive Library Of Resveratrol News



    January 20, 2012: by admin

    As one can learn from the following 2006 report in NATURE METHODS, western blot tests are a perfect tool with which to expose any vulnerable scientist.

    According to the following report, 25% of accepted manuscripts contained at least one “inappropriately manipulated figure,” but obviously the authors of these papers did not all face expulsion from their institutions as did Dr. Das. This report in Nature Methods does not say that editors also reject papers whose western blot images are not of sufficient quality to be reproduced, forcing authors to enhance the images. But it should. Furthermore, apparently most of these cases involve researchers who had no intent to deceive.

    The question arises: why didn’t the editors of the 11 journals that published the 26 papers submitted by Dr. Das ever catch these “altered images” in their peer review and scientific integrity efforts? It seems to me researchers should submit raw images and let the journals take all the ethical criticism rather than risk their careers over this. Would firing 25% of the researchers fix the problem? Obviously not. But it’s OK to pillory Dr. Das.

    The western blot test is a political can of worms. There have been dismissals and sanctions issued over western blot images for a number of years now, and the scientific community is no closer to resolving the problem than when the issue first came to light a few years back. Today we have better, though more expensive, tests that can be performed to measure protein levels in tissues. The most sophisticated is microRNA analysis. NIH researchers used microRNA analysis to validate Dr. Das’ work, which involved the use of resveratrol in excised rodent hearts that were subjected to experimental heart attack. Resveratrol restored the normal microRNA gene expression pattern following a heart attack in animals.

    The western blot test was only one of several tests performed to demonstrate the benefits of resveratrol. The most remarkable were direct photo images of the heart, which showed that resveratrol reduced the area of scarring (fibrosis) following an experimental heart attack. These photo images themselves provide observable evidence that resveratrol, given prior to a heart attack, protects the heart and can turn a mortal heart attack into a non-mortal event. This is jaw-dropping science.

    Given that aspirin was recently found to be ineffective at reducing the risk for mortal heart attacks, cardiology should be focusing on resveratrol. The release of accusations against the leading resveratrol researcher in cardiology was perfectly timed. In the 8 years since a Harvard professor indicated resveratrol is a key molecule in red wine responsible for the French Paradox (the fact that the wine-drinking French have a much lower rate of coronary artery disease mortality despite their high-fat diet and high cholesterol levels), not one human study involving resveratrol in cardiology has ensued. The higher crime appears to be foot-dragging by modern medicine. – Bill Sardi, managing partner, Resveratrol Partners LLC, dba LONGEVINEX

    Nature Methods 3, 237 (2006)

    A picture worth a thousand words (of explanation)

    While the dust slowly settles over a staggering case of scientific fraud, a bitter aftertaste lingers with scientists and editors alike—that of having been deceived by grossly manufactured evidence. It is clear that fraud is a shameful exception, but this climate may be conducive to a little reflection on some less extreme practices of figure manipulation, which—if seemingly more innocuous—are largely more common. With current processing software, a few clicks of the mouse make possible a spectrum of image manipulations, from innocent embellishment to scientific misconduct. Nature journals have prepared new guidelines in an attempt to clarify the boundaries of acceptability in preparing images for publication.

    We do believe that in the vast majority of cases intentions are good: authors seek to present data more clearly. But good intent does not make all practices acceptable, and the few numbers available to quantify the unacceptable are surprisingly high.

    Particularly informative are the statistics gathered by the Journal of Cell Biology, which for almost 4 years has been applying a systematic search for image manipulation in its accepted papers before publication (The Scientist 20, 24; 2006). This scrutiny led to 1% of accepted papers being revoked on the grounds that image manipulation affected the interpretation of data. Surprisingly, 25% of accepted manuscripts contained at least one inappropriately manipulated figure for which a satisfactory replacement could be obtained from the authors upon further investigation. These rates have not declined since the policy was implemented. Such numbers, together with anecdotal evidence, suggest that a large proportion of authors are not aware of what does and does not constitute inappropriate image manipulation.

    The new Nature journal guidelines are an attempt to clarify these limits. Their principle is that no modification can be made that selectively affects only a portion of the image, removes information or adds information obtained in a different experiment. Even without affecting the paper’s conclusion, such modifications may have consequences. For example, hiking up the contrast of a western blot image to decrease the appearance of background will provide the community with a false idea of the antibody quality. Moreover, some observations that do not appear to make sense in the context of the current body of knowledge may turn out to be logical once the biology of the system is understood. Removing such peripheral information from images today will lead to contradictions tomorrow.

    Thanks to electronic submission and publication processes, much more information may be presented to editors, reviewers and ultimately readers. When modifications are unavoidable, authors should provide a clear description of the manipulation in the figure legend and are encouraged to provide original images as supplementary information.

    Distortion of data can also occur before the figures are prepared, that is, at the time of image acquisition. This is particularly true for fluorescence microscopy (Nat. Methods 2, 889; 2005). Unless properly trained, many researchers may not fully understand the consequences of adjusting instrument settings. To allow readers to fully comprehend the context of an experiment, we request that the parameters affecting image acquisition be recorded in the paper. Additionally, Nature Methods has recently published a special focus issue with the goal of providing a reference of best practices for new users of fluorescence microscopy.

    Several documented cases of published papers containing manipulated images, whether intentionally deceptive or not, have reflected a disconnect between the people who acquire the results and those who report them (Nature 434, 952; 2005). When even a single person distorts evidence unbeknownst to their coauthors, the reputation of honest, unsuspecting scientists is at stake. To avoid such damaging situations, corresponding authors need to acknowledge their responsibility to be accountable for the scientific veracity of the work.

    From now on, Nature Methods will also be requesting, upon conditional acceptance of a paper, a statement by the corresponding author assuring that the figures provided for publication accurately represent the original data in agreement with our guidelines. This will be viewed as bureaucratic nonsense by some, but nevertheless, we hope it will promote a dialog within research groups and improve awareness of the acceptable limits of image modification.

    We encourage you to read these guidelines, discuss them with your peers and perhaps reconsider some practices that you have been applying without contemplating their consequences. We hope that this is a solid step forward in dismantling the myth that more-than-perfect images are needed to gain the approval of reviewers and editors.


    Images submitted with a manuscript for review should be minimally processed (for instance, to add arrows to a micrograph). Authors should retain their unprocessed data and metadata files, as editors may request them to aid in manuscript evaluation. If unprocessed data are unavailable, manuscript evaluation may be stalled until the issue is resolved. All digitized images submitted with the final revision of the manuscript must be of high quality and have resolutions of at least 300 dpi.

    A certain degree of image processing is acceptable for publication (and for some experiments, fields and techniques is unavoidable), but the final image must correctly represent the original data and conform to community standards. The guidelines below will aid in accurate data presentation at the image processing level; authors must also take care to exercise prudence during data acquisition, where misrepresentation must equally be avoided. Manuscripts should include a single Supplementary Methods file (or a subsection of a larger Supplementary Methods file) labeled ‘equipment and settings’ that describes for each figure the pertinent instrument settings, acquisition conditions and processing changes, as described in this guide.

    1. Authors should list all image acquisition tools and image processing software packages used.
    2. Authors should document key image-gathering settings and processing manipulations in the Supplementary Methods.
    3. Images gathered at different times or from different locations should not be combined into a single image, unless it is stated that the resultant image is a product of time-averaged data or a time-lapse sequence. If juxtaposing images is essential, the borders should be clearly demarcated in the figure and described in the legend.
    4. The use of touch-up tools, such as cloning and healing tools in Photoshop, or any feature that deliberately obscures manipulations, is to be avoided.
    5. Processing (such as changing brightness and contrast) is appropriate only when it is applied equally across the entire image and is applied equally to controls. Contrast should not be adjusted so that data disappear. Excessive manipulations, such as processing to emphasize one region in the image at the expense of others (for example, through the use of a biased choice of threshold settings), are inappropriate, as is emphasizing experimental data relative to the control.
    6. When submitting revised final figures upon conditional acceptance, authors may be asked to submit original, unprocessed images.
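Guideline 5 above can be made concrete with a small sketch. This is an illustration of the principle, not code from the guidelines: a linear brightness/contrast adjustment is defensible only when the identical transform is applied to every pixel, background and controls included, rather than to a hand-picked region. The function name and the tiny pixel grid below are invented for the example.

```python
# Illustrative sketch: a uniform linear brightness/contrast transform.
# Applying the SAME gain and offset to every pixel preserves the relative
# intensities of bands, background and controls; editing only a selected
# region (e.g. darkening one band) would not, and is what the guidelines
# prohibit.

def adjust_contrast(pixels, gain, offset):
    """Apply one linear transform to every pixel, clamped to 0-255."""
    return [[max(0, min(255, round(p * gain + offset))) for p in row]
            for row in pixels]

# A toy 2x3 "blot": low values are background, high values are bands.
blot = [[10, 200, 15],
        [12, 180, 11]]

# Uniform processing: every pixel is transformed identically, so the
# band-to-background relationship stays honest.
adjusted = adjust_contrast(blot, gain=1.2, offset=-5)
```

The unacceptable counterpart would be calling a transform like this on a sub-rectangle of the image only, or choosing per-region thresholds, which is exactly the "biased choice of threshold settings" the guideline warns against.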
