Posts Tagged ‘Image’

Researchers release free AI-powered Fawkes image privacy tool for ‘cloaking’ faces

13 Aug

Researchers with the University of Chicago’s SAND Lab have detailed the development of a new tool called Fawkes that subtly alters images in a way that makes them unusable for facial recognition. The tool comes amid growing concerns about privacy and a New York Times report detailing the secret scraping of billions of online images to create facial recognition models.

Put simply, Fawkes is a cloaking tool that modifies images in ways imperceptible to the human eye. The idea is that anyone can download the tool, which has been made publicly available, to first cloak their images before posting them online. The name was inspired by Guy Fawkes, whose mask was popularized by the movie V for Vendetta.

The Fawkes algorithm doesn’t prevent a facial recognition algorithm from analyzing a face in a digital image — instead, it teaches the algorithm a ‘highly distorted version’ of what that person’s face looks like without triggering errors; the cloaking cannot, the researchers say, be ‘easily detected’ by the machines, either.

When a facial recognition model is trained on these cloaked images, they subtly disrupt its attempt to learn that person’s face, making it less capable of identifying them when presented with uncloaked imagery. The researchers claim their cloaking algorithm is ‘100% effective’ against top-tier facial recognition models, including Amazon Rekognition and Microsoft Azure Face API.

As well, the team says their disruption algorithm has been ‘proven effective’ in many environments through extensive testing. The use of such technology would be far more subtle and difficult for authorities to prevent compared to more conventional concepts like face painting, IR-equipped glasses, distortion-causing patches or manual manipulation of one’s own images.

These conspicuous methods are known as ‘evasion attacks,’ whereas Fawkes and similar tools are referred to as ‘poison attacks.’ As the name implies, the method ‘poisons’ the data itself so that it ‘attacks’ deep learning models that attempt to utilize it, causing more widespread disruption to the overall model.

The researchers note that Fawkes is more sophisticated than a mere label attack, saying the goal of their utility is ‘to mislead rather than frustrate.’ Whereas a simple corruption of data in an image could make it possible for companies to detect and remove the images from their training model, the cloaked images imperceptibly ‘poison’ the model in a way that can’t be easily detected or removed.

As a result, the facial recognition model loses accuracy fairly quickly and its ability to detect that person in other images and real-time observation drops to a low level.

Yes, that’s McDreamy.

How does Fawkes achieve this? The researchers explain:

‘DNN models are trained to identify and extract (often hidden) features in input data and use them to perform classification. Yet their ability to identify features is easily disrupted by data poisoning attacks during model training, where small perturbations on training data with a particular label can shift the model’s view of what features uniquely identify …

But how do we determine what perturbations (we call them “cloaks”) to apply to [fictional example] Alice’s photos? An effective cloak would teach a face recognition model to associate Alice with erroneous features that are quite different from real features defining Alice. Intuitively, the more dissimilar or distinct these erroneous features are from the real Alice, the less likely the model will be able to recognize the real Alice.’
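The quoted idea — a small, bounded perturbation that drags an image’s features toward a dissimilar identity — can be sketched in a few lines. Everything below is a toy stand-in: the random linear ‘feature extractor’, the pixel budget and the toy image are illustrative assumptions, not Fawkes’s actual model, parameters or code (the real tool optimizes against feature layers of a deep face recognition network).

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "feature extractor": a fixed random linear map. Fawkes itself uses
# a deep face recognition model; a linear map keeps this sketch self-contained.
W = rng.normal(size=(8, 64))

def features(x):
    return W @ x

def cloak(image, target_feats, budget=0.03, steps=200, lr=0.001):
    """Nudge `image` so its features drift toward `target_feats` (a dissimilar
    face), while clipping every per-pixel change to `budget` so the edit stays
    imperceptible. Hypothetical sketch, not the Fawkes algorithm itself."""
    delta = np.zeros_like(image)
    for _ in range(steps):
        residual = features(image + delta) - target_feats
        grad = 2 * W.T @ residual                 # d/d(delta) of ||f(x+d) - t||^2
        delta = np.clip(delta - lr * grad, -budget, budget)  # enforce pixel budget
    return image + delta

image = rng.uniform(size=64)                      # toy 64-"pixel" image
target_feats = features(rng.uniform(size=64))     # features of a dissimilar face
cloaked = cloak(image, target_feats)

max_change = np.max(np.abs(cloaked - image))
dist_before = np.linalg.norm(features(image) - target_feats)
dist_after = np.linalg.norm(features(cloaked) - target_feats)
```

The clipping step is the part that mirrors the ‘imperceptible’ claim: no pixel may move more than the budget, yet the features still shift measurably toward the decoy identity.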

The goal is to discourage companies from scraping digital images from the Internet without permission and using them to create facial recognition models for unaware people, a huge privacy issue that has resulted in calls for stronger regulations, among other things. The researchers point specifically to the aforementioned NYT article, which details the work of a company called Clearview.ai.

According to the report, Clearview has scraped more than three billion images from a variety of online sources, including everything from financial app Venmo to obvious platforms like Facebook and less obvious ones like YouTube. The images are used to create facial recognition models for millions of people who are unaware of their inclusion in the system. The system is then sold to government agencies who can use it to identify people in videos and images.

Many experts have criticized Clearview.ai for its impact on privacy and apparent facilitation of a future in which the average person can be readily identified by anyone with the means to pay for access. Quite obviously, such tools could be used by oppressive governments to identify and target specific individuals, and put to more insidious uses like the constant surveillance of a population.

By using a method like Fawkes, individuals who possess only basic tech skills are given the ability to ‘poison’ the unauthorized facial recognition models trained specifically to recognize them. The researchers note that there are limitations to such technologies, however, making it tricky to sufficiently poison these systems.

One of these images has been cloaked using the Fawkes tool.

For example, a person may be able to cloak the images they share of themselves online, but they may find it difficult to control images of themselves posted by others. Images posted by known associates like friends may make it possible for these companies to train their models, though it’s unclear whether people can be quickly located in third-party images (for training purposes) in an automated fashion and at mass scale.

Any entity that is able to gather enough images of the target could train a model well enough that a minority of cloaked images fed into it would be unable to substantially lower its accuracy. Individuals can attempt to mitigate this by sharing more cloaked images of themselves in identifiable ways and by taking other steps to reduce their uncloaked presence online, such as removing name tags from images, using ‘right to be forgotten’ laws and simply asking friends and family to refrain from sharing images of them online.

Another limitation is that Fawkes — which has been made available as a free download for Linux, macOS and Windows — only works on still images. This means it cannot cloak videos, which can be downloaded and parsed into individual frames. These frames could then be fed into a training model to help it learn to identify that person, something that becomes increasingly feasible as consumer-tier cameras offer widespread access to high-resolution, high-quality video recording.

Despite this limitation, Fawkes remains an excellent tool for the public, enabling the average person with access to a computer and the ability to click a couple of buttons to take more control over their privacy.

A full PDF of the Fawkes image-cloaking study can be found on the SAND Lab website.

Articles: Digital Photography Review (dpreview.com)

 

 

Unsplash releases massive open-source image dataset with 2M high-quality photos

07 Aug

Unsplash, a website that enables anyone to share high-quality images under a Creative Commons Zero license, has announced the release of what it says is the ‘most complete high-quality open image dataset ever.’ The dataset contains more than 2,000,000 images, according to Unsplash, which sourced the images from more than 200,000 photographers around the world.

An image dataset is a collection of images that can be downloaded as a full batch; they contain relevant details, such as EXIF data, location information and more. In this case, Unsplash says that its dataset includes data on AI- and community-generated keywords for the images, landmark details when relevant, image categories and subcategories, download stats, the number of image views, groupings of images, user-generated collections and ‘keyword-image conversions in search results.’

All data included with the dataset is anonymized and private, with the only exception being attribution to photographers. Unsplash says that it sourced the data from ‘hundreds of millions [of] searches across a nearly unlimited number of uses and contexts.’

The ‘complete’ nature of this dataset distinguishes it from other open-source image datasets, which Unsplash notes often suffer from issues such as reliance on mass image labeling by third parties, the use of low-quality images and size limitations, all of which may limit their usefulness.

In its present form, the dataset is 16GB in size, but Unsplash says that it will continue updating the dataset with additional images and fields as its online library grows.

The dataset is available to download from a dedicated portal on the Unsplash website, where two download options are available: the full high-quality 16GB dataset, which is offered only for non-commercial use, and a ‘Lite’ version that is only 550MB and available for both non-commercial and commercial use.

The full dataset contains more than 2,000,000 images, 5,000,000 keywords and 250,000,000 searches. The ‘Lite’ dataset is limited to 25,000 images and keywords, as well as 1,000,000 searches. Whereas the Lite dataset is available for anyone to download, the full dataset requires users to request permission to download.

The company requires certain details from the user as part of their request, including name, email and the intended use of the data. In addition to the dedicated download website, Unsplash has published the related documentation on GitHub.
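As a rough illustration of working with a dataset distributed this way: the tables ship as tab-separated files that any TSV reader can parse. The file contents and column names below are illustrative assumptions, not the dataset’s exact schema (see the documentation on GitHub for that).

```python
import csv
import io

# A tiny in-memory stand-in for one of the dataset's tab-separated tables.
# Column names are hypothetical, chosen only to illustrate the format.
sample_tsv = (
    "photo_id\tphotographer_username\tstats_downloads\n"
    "abc123\talice\t1500\n"
    "def456\tbob\t300\n"
)

rows = list(csv.DictReader(io.StringIO(sample_tsv), delimiter="\t"))
most_downloaded = max(rows, key=lambda r: int(r["stats_downloads"]))
print(most_downloaded["photo_id"])  # abc123
```

For the real 16GB download you would point the reader at the files on disk rather than an in-memory string; the parsing logic stays the same.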

Unsplash remains as controversial as it is popular. The website has been integrated into a number of services, including Adobe, Trello, Wix, Medium, Facebook and thousands of other platforms. The service is distinguished from other free photo platforms by the high-quality nature of the images available to the public under a CC0 license, making them available for non-commercial and commercial use.

Professional photographers have criticized the platform as undermining the profession and photographers who contribute images as devaluing their work, among other things. Back in 2017, Unsplash founder Mikael Cho attempted to address these concerns in a blog post, stating, ‘We didn’t start Unsplash to reinvent an industry. We started Unsplash because we thought it might be useful.’


 

Canon image hosting platform, image.canon, temporarily shut down after loss of users’ content

03 Aug

Over the weekend, Canon’s cloud media hosting platform, image.canon, suffered an outage that left users unable to log in and use the service. No specific information was provided over the weekend, but we now know what went wrong.

In a statement shared on the image.canon homepage, Canon confirmed there’s been an issue with its long-term storage on image.canon that’s resulted in the loss of original image and video uploads. The full notice reads as follows:

Important Notice
Thank you for using image.canon.
On the 30th of July, we identified an issue within the 10GB long term storage on image.canon. Some of the original photo and video data files have been lost. We have confirmed that the still image thumbnails of the affected files have not been affected.
In order to conduct further review, we have temporarily suspended both the mobile app and web browser service of image.canon.
Information regarding the resumption of service and contact information for customer support will be made available soon.
There has been no leak of image data.
We apologize for any inconvenience.

To prevent any further issues, Canon has temporarily shut down both the mobile and web app versions of image.canon. Per the notice, we should have further updates ‘soon.’ We will update this article when further updates are provided.


 

Astronomers capture first-ever image of two exoplanets orbiting a 17M year-old Sun-like star

27 Jul

The European Southern Observatory’s Very Large Telescope (ESO VLT) has captured the first-ever image of two exoplanets orbiting a Sun-like star.

As the ESO explains in its blog post on the impressive feat, observing systems with multiple exoplanets is ‘extremely rare’ and, until this image, astronomers had never ‘directly observed’ multiple planets orbiting a young star.

Credit: ESO/Bohn et al.

In this groundbreaking image, captured by the SPHERE instrument on the ESO VLT, two ‘giant’ exoplanets are shown orbiting the star TYC 8998-760-1, which is estimated to be 17 million years old. Scientists captured the image by using a coronagraph to block the light from the young star, allowing the light bouncing off the fainter planets to be seen.

The two gas giants are approximately 160 and 320 times as far from their host star as the Earth is from the Sun. ‘This places these planets much further away from their star than Jupiter or Saturn, also two gas giants, are from the Sun; they lie at only 5 and 10 times the Earth-Sun distance, respectively,’ reads the blog post.
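Those separations can be put in perspective with a quick calculation. One astronomical unit (the Earth-Sun distance) is about 149.6 million km; the planet distances below are the approximate figures from the post.

```python
AU_KM = 149.6e6  # kilometers in one astronomical unit (Earth-Sun distance)

inner_au, outer_au = 160, 320    # the two gas giants of TYC 8998-760-1
jupiter_au, saturn_au = 5, 10    # approximate values given in the post

print(inner_au / jupiter_au)     # 32.0: the inner planet orbits ~32x farther out than Jupiter
print(outer_au / saturn_au)      # 32.0: same ratio for the outer planet vs. Saturn
print(outer_au * AU_KM)          # roughly 4.8e10 km from the host star
```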

This chart shows the location of the TYC 8998-760-1 system. This map shows most of the stars visible to the unaided eye under good conditions and the system itself is marked with a red circle.

You can find information on this image and future findings by heading over to the ESO website.


 

Mashable embedded image copyright case revived over surprising Facebook statement

26 Jun

The 2016 copyright infringement case against the media website Mashable that we last heard about in April is back again. Following a similar case with an opposite ruling regarding how copyright infringement may pertain to embedded Instagram posts, the US District Court for the Southern District of New York has reopened the copyright suit filed by photographer Stephanie Sinclair against Mashable.

Sinclair’s lawsuit is part of a copyright spat between the photographer and Mashable after the website embedded one of her Instagram posts in a 2016 article titled ’10 female photojournalists with their lenses on social justice.’ Mashable had first reached out to Sinclair and offered $50 to license the image, an offer that she rejected. As an apparent loophole, Mashable then simply embedded Sinclair’s public Instagram post featuring the same image.

A screenshot of the article in question. Sinclair’s Instagram photo has since been removed.

In her lawsuit, Sinclair had argued that Mashable did not have permission nor a license to use the image, while Mashable countered that it didn’t need the photographer’s permission because Instagram’s terms covered sublicensing. Instagram’s terms of service stated at the time that users:

…hereby grant to [Instagram] a non-exclusive, royalty-free, transferable, sub-licensable, worldwide license to host, use, distribute, modify, run, copy, publicly perform or display, translate, and create derivative works of your content (consistent with your privacy and application settings). You can end this license anytime by deleting your content or account.’

Based on its understanding of those terms, the court ruled against Sinclair, stating in April that, ‘Mashable was within its rights to seek a sublicense from Instagram when Mashable failed to obtain a license directly from Plaintiff…’

However, Instagram’s parent company Facebook introduced a plot twist earlier this month when it clarified in relation to a different but similar case against Newsweek that its terms do not cover sublicensing for embedded images. According to Facebook, and despite the fact that Instagram offers a ‘share’ function on public images by default, users must first get permission from the photographer before embedding their image.

This unexpected turn of events was a bittersweet moment, offering reassurance that Instagram users have more control over their images than previously thought, but with major implications for how future digital copyright cases are handled. Users who are unaware of the intricacies of Instagram’s terms could, for example, be liable for copyright infringement by simply using the feature made available to them by the platform.

Facebook’s statement has prompted the reopening of Sinclair’s copyright case, as the ruling in favor of Mashable was made with the understanding that Instagram’s terms covered sublicensing for embedded images. Sinclair filed a motion for reconsideration with the court in light of the new information, a request that has since been granted.

The case has been reopened because, according to presiding judge Kimba Wood, Mashable didn’t get ‘explicit consent’ from Instagram to embed the photo under its sublicensing terms. The lawsuit against Mashable can proceed, with Judge Wood stating in the court’s Opinion & Order that:

Revising its previous holding, the Court holds that the pleadings contain insufficient evidence to find that Instagram granted Mashable a sublicense to embed Plaintiff’s Photograph on its website … the Court did not give full force to the requirement that a license must convey the licensor’s “explicit consent” to use a copyrighted work.

The two new cases over Instagram embedding and how it pertains to copyright have renewed criticism of the platform for failing to give users more control over their content. Instagram automatically presents a sharing feature on all public Instagram posts, yet has made it clear that it doesn’t sublicense content shared with this feature, putting users at risk of liability.

Photographers are given the choice to make their images private, therefore removing the embed function, but with the consequence of reduced exposure to potential clients and customers. Enabling photographers to manually choose whether the sharing function is enabled on their public posts would remove this issue, but is not something Instagram presently offers.

In a statement to Ars Technica, Instagram had addressed this topic by stating that it was ‘considering the possibility’ of adding a new feature that would allow users to decide whether others can embed their public images. The non-committal nature of the statement, however, indicates that Instagram may never introduce such a modification, putting the burden on photographers and users to sort out the copyright implications of using the feature.


 

Video: How scratch-proof is a Sony a6000 image sensor?

23 Jun

If you’re a Sony user with a weak stomach, you might want to look away for this one. Photographer and YouTuber Arthur R shared a video this past week that looks at just how scratch-proof an image sensor is.

Admittedly, this isn’t the most scientific of tests, as he’s using a scrap sensor and doesn’t put it back into the camera to see if any damage not visible to the eye is affecting image quality, but it’s an interesting test nonetheless. Using small strips of tape, Arthur divides the sensor — taken from a Sony a6000 — into four quadrants and uses four different stressors to test the durability of the sensor: dust, dirt, oils and a knife.

As Arthur details in the ten-minute video, the durability of the sensor is impressive, at least to the naked eye. Dust, dirt and oils didn’t show any noticeable markings and even the box cutter abuse only yielded a few scratches. Granted, smaller scratches that could affect image quality might be visible under a microscope, but the sensor still came out better than he expected.

You can find more videos from Arthur R over on his YouTube Channel.


 

DPReview TV: How much do scratches, dust and fingerprints affect lens image quality?

18 Jun

We all know that damage to your lens is bad, but just how bad is bad? Chris and Jordan investigate the image quality impact of dust, water, fingerprints and cringe-inducing scratches on your lens. As you might imagine, the results range from ‘barely noticeable’ to, well, much worse than that.

Subscribe to our YouTube channel to get new episodes of DPReview TV every week.

  • Introduction
  • Fingerprints
  • Mist and water droplets
  • Dust
  • Light scratches
  • Deep scratches
  • What we learned

Sample images from this episode



 

Samsung reported to be expanding its image sensor production line

01 Jun

According to Business Korea, Samsung is planning to convert one of its current DRAM manufacturing lines into a camera sensor production line to the tune of roughly $815M.

The report says Samsung will be converting its DRAM line 13 in Hwaseong, Gyeonggi Province, South Korea into a camera sensor production unit. This transition isn’t unprecedented, as Samsung did the same back in 2018 with its DRAM line 11, converting it into the camera sensor production line S4.

This transition from DRAM production to camera sensor production happens because, as noted by Android Headline, approximately 80% of the manufacturing processes and equipment for the two operations overlap. So, rather than build from scratch, Samsung can save a dramatic sum of money by simply converting an existing production line.

Despite so much overlap between the two processes, the conversion is set to cost one trillion South Korean won, which is roughly $815M at the current exchange rate.

Business Korea doesn’t note what kind of sensors the converted production line will manufacture, but Samsung recently showed off its new 50MP ISOCELL GN1 sensor and has also teased the development of both a 150MP sensor and a 250MP sensor, with hopes to someday create a 600MP smartphone sensor.

Android Headline cites ‘industry experts,’ saying mass production on the converted line could start as early as year’s end, ‘once it completes installing and testing the new equipment.’


 

Samsung is aiming to develop 600MP image sensors

21 Apr

In the last couple of years or so we have seen the size of image sensors in high-end smartphones increase quite dramatically. At the same time pixel counts have skyrocketed, driven, at least in part, by the use of pixel-binning technology to capture images with lower noise levels and a wider dynamic range than would be possible with conventional sensor technology.

The sensor in the main camera of Samsung’s latest flagship smartphone, the Galaxy S20 Ultra, is a prime example of both these trends. At 1/1.33″ it’s one of the largest currently available (only the 1/1.28″ chip in the Huawei P40 Pro is bigger), and its whopping 108MP resolution allows for pixel binning and all sorts of computational imaging wizardry to produce high-quality 12MP default output.

In terms of pixel binning, this latest Samsung sensor takes things even one step further than previous generations. Instead of four, it combines nine pixels into one for an effective pixel size of 2.4µm.

Now we’ve learned that the South Korean company has no intention of stopping there. In a blog post on the company website, Samsung’s Head of Sensor Business Team, Yongin Park, explains that it is the company’s goal to design and produce image sensors that go beyond the resolution of the human eye, which is said to be around 500MP.

However, Park is aware that numerous challenges have to be overcome to achieve this goal.

‘In order to fit millions of pixels in today’s smartphones that feature other cutting-edge specs like high screen-to-body ratios and slim designs, pixels inevitably have to shrink so that sensors can be as compact as possible.

On the flip side, smaller pixels can result in fuzzy or dull pictures, due to the smaller area that each pixel receives light information from. The impasse between the number of pixels a sensor has and pixels’ sizes has become a balancing act that requires solid technological prowess,’ he writes.

Launched in 2013, Samsung’s ISOCELL technology has been paramount in allowing more and more pixels to be implemented on smartphone image sensors by isolating pixels from each other, thus reducing light spill and reflections between them. At first this was done using metal ‘barriers’; later generations used an unspecified ‘innovative material’.

Tetracell technology came along in 2017 and used 2×2 pixel binning to increase the effective pixel size. It was superseded by the company’s Nonacell tech and its 3×3 pixel arrays earlier this year. At the same time, Samsung engineers were also able to reduce pixel size to a minuscule 0.7µm, something Park says was previously believed to be impossible.
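The 2×2 (Tetracell) and 3×3 (Nonacell) grouping described above is, at its core, simple array arithmetic: combine each block of neighboring photosites into one output value. The sketch below is a conceptual, single-channel illustration; the `bin_pixels` helper is hypothetical, and real sensors bin same-colored photosites under the color filter array rather than raw neighbors.

```python
import numpy as np

def bin_pixels(raw, factor=3):
    """Average each `factor` x `factor` block of photosites into one pixel.

    With factor=3, nine photosites become one output pixel; nine 0.8µm
    photosites thus behave like a single 2.4µm pixel (3 x 0.8µm on a side).
    """
    h, w = raw.shape
    assert h % factor == 0 and w % factor == 0
    blocks = raw.reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

sensor = np.arange(36, dtype=float).reshape(6, 6)  # toy 6x6 sensor readout
binned = bin_pixels(sensor, factor=3)
print(binned.shape)  # (2, 2): 36 photosites reduced to 4 output pixels
```

Averaging (or summing) the block is what buys the lower noise and wider dynamic range mentioned above, at the cost of output resolution, which is why the S20 Ultra’s 108MP sensor defaults to 12MP output.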

So, what can we expect from Samsung’s sensor division in the medium and long term? Park says that the company is ‘aiming for 600MP for all’ but doesn’t provide much detail on how this could be achieved. These sensors would not necessarily be exclusive to use in smartphones, however, and could be implemented for a wide range of applications.

‘To date, the major applications for image sensors have been in the smartphones field, but this is expected to expand soon into other rapidly-emerging fields such as autonomous vehicles, IoT and drones,’ he explains.

In addition, the company is looking at applications for its sensors that go beyond photography and videography. According to Park, sensors that are capable of detecting wavelengths outside the range of human eyes are still rare, but could benefit areas such as cancer diagnosis in medicine or quality control in agriculture.


 

New York court rules website didn’t violate image copyright by embedding Instagram post

16 Apr

A Manhattan federal court has dismissed professional photojournalist Stephanie Sinclair’s copyright claim against digital media website Mashable, ruling that it did not violate her copyright by embedding one of her Instagram posts on its website. The legal issue arose in 2016 when Mashable published an article on female photographers whose work includes the topic of social justice, putting Sinclair at #9 on its list.

According to court documents, Mashable contacted Sinclair in March 2016 and offered to pay $50 to license one of her images for use in its article on female photographers. Sinclair declined the offer, so Mashable instead embedded an Instagram post of the image that Sinclair had published on her public Instagram account.

Fast-forward to January 2018 when, according to the court documents, Sinclair contacted Mashable and demanded that they remove the embedded post from the article on the grounds of copyright infringement. Mashable refused to remove the Instagram post and 10 days later, Sinclair filed a copyright lawsuit against the publication and its parent company Ziff Davis, LLC.

The lawsuit raised questions over Instagram’s Terms of Service, its right to grant sublicenses for images uploaded to its platform, and whether sharing and embedding public social media posts without permission or a direct image license constitutes copyright infringement.

Instagram states in its Terms of Use that while it does not claim ownership of users’ images, users grant the company a license to use them when they upload content to the platform. Instagram says that when a user uploads images to its website…

‘…you hereby grant to us a non-exclusive, royalty-free, transferable, sub-licensable, worldwide license to host, use, distribute, modify, run, copy, publicly perform or display, translate, and create derivative works of your content (consistent with your privacy and application settings). You can end this license anytime by deleting your content or account.’

Mashable argued that based on that Terms of Use, it had a valid sublicense from Instagram that allowed it to embed the image post on its website. The defendant argued, among other things, that:

‘…because Plaintiff uploaded the Photograph to Instagram and designated it as “public,” she agreed to allow Mashable, as Instagram’s sublicensee, to embed the Photograph in its website.’

Sinclair’s legal claim countered this, according to court documents, which cite multiple arguments, including a claim that because Mashable didn’t get a direct image license from the photographer, it shouldn’t have been able to get a sublicense for the content from Instagram. The court disagreed with that argument, however, with U.S. District Court Judge Kimba Wood noting:

‘Plaintiff’s right to grant a license directly to Mashable, and Instagram’s right, as Plaintiff’s licensee, to grant a sublicense to Mashable, operate independently. Mashable was within its rights to seek a sublicense from Instagram when Mashable failed to obtain a license directly from Plaintiff—just as Mashable would be within its rights to again seek a license from Plaintiff, perhaps at a higher price, if Plaintiff switched her Instagram account to “private” mode.’

As well, Sinclair had argued that it is ‘unfair’ that a platform like Instagram is able to force professional photographers to choose between keeping their accounts private or allowing the company to sublicense their publicly shared content because it is ‘one of the most popular public photo-sharing platforms in the world.’

Judge Wood acknowledges the nature of this issue, but ultimately states that:

‘Unquestionably, Instagram’s dominance of photograph- and video-sharing social media, coupled with the expansive transfer of rights that Instagram demands from its users, means that Plaintiff’s dilemma is a real one. But by posting the Photograph to her public Instagram account, Plaintiff made her choice. This Court cannot release her from the agreement she made.’

The copyright claim was ultimately dismissed, a conclusion that contrasts with the ruling from a New York court in early 2018 in the case of an embedded tweet featuring an image of athlete Tom Brady.

In that case, the court found that embedding such tweets may constitute copyright infringement and the fact they were uploaded to a third-party server like Twitter didn’t change that. The basis of the latest ruling is different, however, focusing on the terms of use the photographer agreed to rather than the ‘server test’ used in the 2018 copyrighted tweet case.

Both of these legal claims follow a different legal case from 2007, which set the precedent for how today’s Internet operates: a person or company that embeds content hosted by a third-party source like Facebook or Twitter is not in violation of copyright; rather, the hosting company itself is liable.

DPReview contacted Mickey Osterreicher, NPPA’s general counsel, for comment. He had the following to say about this New York ruling:

‘I have not had an opportunity to review the court’s opinion and order in this case so I do not feel it appropriate for me to comment. I will repeat something that NPPA has stressed for many years – photographers read and understand the terms of service or the terms of use on each and every social media platform before agreeing to them or posting on those sites. They also must continue to vigilantly monitor those terms as they are frequently changed and updated.’
