Human Generated Data

Title

Untitled (three women wearing fur coats posing with trees in background)

Date

c. 1950

People

Artist: Jack Rodden Studio, American, 1914-2016

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13643

Machine Generated Data

Tags

Amazon
created on 2022-02-04

Person 99.2
Human 99.2
Person 98.4
Clothing 97.2
Apparel 97.2
Person 96.7
Nature 93.6
Outdoors 85.9
Face 82.3
Monitor 72.1
Screen 72.1
Display 72.1
Electronics 72.1
People 68.9
Female 68.4
Snow 65.3
Weather 62.5
LCD Screen 56.4
Veil 55.8
Hat 55.6
Photo 55.5
Photography 55.5
Portrait 55.5
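The Amazon labels above are (name, confidence) pairs. As a minimal illustrative sketch (not part of the record itself), the list can be transcribed and filtered by a confidence threshold; the `confident_labels` helper is hypothetical:

```python
# Labels transcribed from the Amazon Rekognition section of this record.
labels = [
    ("Person", 99.2), ("Human", 99.2), ("Person", 98.4),
    ("Clothing", 97.2), ("Apparel", 97.2), ("Person", 96.7),
    ("Nature", 93.6), ("Outdoors", 85.9), ("Face", 82.3),
    ("Monitor", 72.1), ("Screen", 72.1), ("Display", 72.1),
    ("Electronics", 72.1), ("People", 68.9), ("Female", 68.4),
    ("Snow", 65.3), ("Weather", 62.5), ("LCD Screen", 56.4),
    ("Veil", 55.8), ("Hat", 55.6), ("Photo", 55.5),
    ("Photography", 55.5), ("Portrait", 55.5),
]

def confident_labels(pairs, threshold):
    """Return distinct label names at or above the given confidence."""
    seen = []
    for name, score in pairs:
        if score >= threshold and name not in seen:
            seen.append(name)
    return seen

print(confident_labels(labels, 90.0))
```

At a 90% threshold only the people- and clothing-related labels survive, which matches the photograph's human-written title.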

Imagga
created on 2022-02-04

television 100
telecommunication system 82.5
aquarium 25.6
broadcasting 24.5
car 19.4
telecommunication 18
man 16.1
person 16
screen 15.9
vehicle 15.9
adult 14.2
people 13.9
happy 13.8
portrait 13.6
male 13.5
sitting 12.9
hair 12.7
driver 12.6
black 12.6
pretty 12.6
transportation 12.5
drive 12.3
monitor 12
driving 11.6
smiling 11.6
automobile 11.5
window 11.1
frame 10.8
cute 10.8
medium 10.7
smile 10.7
face 10.6
travel 10.6
fun 10.5
eyes 10.3
one 9.7
outdoors 9.7
computer 9.6
film 9.6
happiness 9.4
business 9.1
technology 8.9
looking 8.8
couple 8.7
display 8.4
road 8.1
sexy 8
love 7.9
old 7.7
auto 7.7
fashion 7.5
human 7.5
equipment 7.5
electronic 7.5
inside 7.4
object 7.3
transport 7.3
office 7.2
modern 7

Google
created on 2022-02-04

Microsoft
created on 2022-02-04

monitor 97.1
television 95.7
text 91.8
window 90.4
wedding dress 86.4
human face 80.4
person 79.1
screen 77.6
clothing 76.9
old 73.9
screenshot 70.6
bride 65
woman 62
black and white 55.9
picture frame 51.5
posing 42.6
flat 34.7
image 33.2
painting 22.2

Face analysis

AWS Rekognition

Age 42-50
Gender Female, 98.5%
Sad 42.1%
Calm 40.2%
Confused 6.9%
Happy 3%
Fear 2.8%
Surprised 2.2%
Disgusted 1.5%
Angry 1.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%
Monitor 72.1%
Hat 55.6%

Captions

Microsoft

a flat screen tv sitting on top of a window 34.1%
a flat screen television 34%
a flat screen tv sitting in front of a window 33.9%

Text analysis

Amazon

This
ST
This S
S

Google

ST
This ST ST
This