Human Generated Data

Title

Untitled (girls pose on lawn in angel costumes)

Date

c. 1930-1940

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5718

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2019-06-01

Human 99.4
Person 99.4
Person 98.1
Person 97.5
People 86.7
Advertisement 85.5
Poster 85.5
Person 84.9
Female 69.1
Outdoors 69
Nature 65.3
Leisure Activities 64.2
Text 62
Person 61.6
Apparel 59.2
Clothing 59.2
Shorts 59
Art 57.1
Crowd 55.7
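
The label list above matches the shape of output from AWS Rekognition's DetectLabels API (a list of name/confidence pairs). As a minimal sketch, the snippet below filters such labels by a confidence threshold; the response dict is a hand-built stand-in using values from this record, not a live API call.

```python
# Hand-built stand-in for a Rekognition DetectLabels response,
# using confidence values taken from the record above.
SAMPLE_RESPONSE = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.4},
        {"Name": "People", "Confidence": 86.7},
        {"Name": "Poster", "Confidence": 85.5},
        {"Name": "Female", "Confidence": 69.1},
        {"Name": "Crowd", "Confidence": 55.7},
    ]
}

def labels_above(response, min_confidence):
    """Return (name, confidence) pairs at or above a confidence threshold."""
    return [
        (label["Name"], label["Confidence"])
        for label in response["Labels"]
        if label["Confidence"] >= min_confidence
    ]

print(labels_above(SAMPLE_RESPONSE, 80.0))
# [('Person', 99.4), ('People', 86.7), ('Poster', 85.5)]
```

With a real client the equivalent call would look roughly like `boto3.client("rekognition").detect_labels(Image={"Bytes": image_bytes}, MinConfidence=55)`.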

Clarifai
created on 2019-06-01

people 99.4
group 97.9
child 96.6
illustration 95.2
art 95.1
man 94.8
adult 93.5
many 93
vintage 92.5
wear 92.1
old 92
desktop 89
retro 87.6
woman 86.7
crowd 84.9
antique 84.4
war 84.4
room 82.1
group together 82.1
monochrome 80.7

Imagga
created on 2019-06-01

newspaper 80.4
product 61.7
creation 48.1
daily 40.6
vintage 29
old 26.5
art 24.2
grunge 19.6
retro 17.2
antique 15.6
ancient 15.6
negative 15.2
film 14.4
decoration 14.3
history 14.3
architecture 13.3
letter 12.8
culture 12.8
money 12.8
paint 12.7
aged 12.7
texture 12.5
frame 12.5
black 12
postmark 11.8
sculpture 11.7
stamp 11.6
mail 11.5
symbol 11.4
building 11.3
historical 11.3
drawing 11.3
design 11.3
structure 11.2
paper 11
dirty 10.8
postage 10.8
city 10.8
postal 10.8
currency 10.8
mask 10.6
text 10.5
card 10.2
border 10
business 9.7
pattern 9.6
grungy 9.5
graphic 9.5
dollar 9.3
house 9.2
travel 9.2
circa 8.9
artistic 8.7
edge 8.7
bill 8.6
finance 8.5
memorial 8.4
monument 8.4
tourism 8.2
cash 8.2
closeup 8.1
bank 8.1
man 8.1
religion 8.1
envelope 8
statue 7.7
us 7.7
collage 7.7
rust 7.7
detailed 7.7
architectural 7.7
post 7.6
banking 7.4
digital 7.3
rough 7.3
people 7.3
landmark 7.2
computer 7.2
graffito 7.2
material 7.1
financial 7.1
screen 7.1

Google
created on 2019-06-01

(no tags recorded)

Microsoft
created on 2019-06-01

text 87.6
person 85.6
clothing 77.1
drawing 53.7
old 52.5
picture frame 10.3

Face analysis

Amazon

AWS Rekognition

Age 10-15
Gender Female, 53.6%
Confused 46.8%
Calm 45.7%
Sad 47.3%
Surprised 46.3%
Angry 46.2%
Disgusted 46.3%
Happy 46.3%

AWS Rekognition

Age 12-22
Gender Female, 50.8%
Disgusted 45.7%
Calm 45.9%
Sad 48.1%
Confused 46.8%
Angry 47%
Surprised 46.2%
Happy 45.3%

AWS Rekognition

Age 48-68
Gender Female, 51.5%
Happy 45.3%
Disgusted 45.9%
Angry 45.8%
Surprised 45.4%
Sad 46.2%
Calm 51%
Confused 45.4%

AWS Rekognition

Age 20-38
Gender Female, 52.3%
Surprised 45.9%
Confused 46.5%
Disgusted 45.4%
Happy 45.3%
Sad 47.4%
Calm 48.4%
Angry 46.1%

AWS Rekognition

Age 35-52
Gender Female, 52.3%
Surprised 45.3%
Sad 45.3%
Happy 45.9%
Angry 45.5%
Disgusted 51.1%
Confused 45.1%
Calm 46.9%

AWS Rekognition

Age 48-68
Gender Male, 50.4%
Disgusted 45.4%
Happy 45.8%
Surprised 45.3%
Sad 48.6%
Angry 45.5%
Confused 45.4%
Calm 48.8%
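
Each per-face block above mirrors the structure of AWS Rekognition's DetectFaces output (AgeRange, Gender, and an Emotions list). As an illustrative sketch, the snippet below picks the highest-confidence emotion for one face; the dict is a hand-built stand-in using the first face's values from this record. Note the emotion scores here cluster tightly (roughly 45-51%), so the "dominant" emotion wins by a small margin.

```python
# Hand-built stand-in mirroring a Rekognition DetectFaces face record,
# populated with the first face's values from the record above.
SAMPLE_FACE = {
    "AgeRange": {"Low": 10, "High": 15},
    "Gender": {"Value": "Female", "Confidence": 53.6},
    "Emotions": [
        {"Type": "CONFUSED", "Confidence": 46.8},
        {"Type": "CALM", "Confidence": 45.7},
        {"Type": "SAD", "Confidence": 47.3},
        {"Type": "SURPRISED", "Confidence": 46.3},
        {"Type": "ANGRY", "Confidence": 46.2},
        {"Type": "DISGUSTED", "Confidence": 46.3},
        {"Type": "HAPPY", "Confidence": 46.3},
    ],
}

def dominant_emotion(face):
    """Return the (type, confidence) of the highest-confidence emotion."""
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

print(dominant_emotion(SAMPLE_FACE))  # ('SAD', 47.3)
```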

Feature analysis

Amazon

Person 99.4%
Poster 85.5%

Captions

Microsoft

a group of people posing for a photo 55.7%
a group of people posing for a photo in front of a window 44.7%
an old photo of a person 44.6%

Text analysis

Amazon

ELHL
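
The detected string above ("ELHL") is the kind of value AWS Rekognition's DetectText API returns in its TextDetections list. As a minimal sketch, the snippet below collects line-level detections; the response is a hand-built stand-in, not a live call, and no confidence is shown because the record does not give one.

```python
# Hand-built stand-in for a Rekognition DetectText response containing
# the single string detected in this record.
SAMPLE_TEXT_RESPONSE = {
    "TextDetections": [
        {"DetectedText": "ELHL", "Type": "LINE"},
    ]
}

def detected_lines(response):
    """Collect the text of each LINE-level detection."""
    return [
        d["DetectedText"]
        for d in response["TextDetections"]
        if d["Type"] == "LINE"
    ]

print(detected_lines(SAMPLE_TEXT_RESPONSE))  # ['ELHL']
```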