Human Generated Data

Title

Untitled (young couple, girl primping in mirror of cigarette machine)

Date

1959

People

Artist: Bruce Davidson, American, born 1933

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Mr. and Mrs. Barnabas McHenry, P1996.123

Copyright

© Magnum Photos, Inc. and Bruce Davidson

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.8
Human 99.8
Person 99.7
Person 98.5
Person 97.4
Footwear 93.3
Clothing 93.3
Apparel 93.3
Shoe 93.3
Shorts 75.3
Leisure Activities 61.8
Poster 59.4
Advertisement 59.4
Pants 59.1
Photo 57.4
Photography 57.4
Finger 57
Face 55.8
Flooring 55.7
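
Tags of this shape (label name plus confidence percentage) are what AWS Rekognition's detect_labels call returns. Below is a minimal, hypothetical sketch using boto3; the bucket name, object key, and thresholds are placeholders, not values from this record.

```python
# Hypothetical sketch: fetching object/scene labels with AWS Rekognition.
# Bucket, key, and thresholds are illustrative placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "davidson-untitled.jpg"}},
    MaxLabels=25,
    MinConfidence=55.0,
)

# Print each label with its confidence, like the "Person 99.8" rows above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```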

Clarifai
created on 2023-10-25

people 99.5
monochrome 98.8
group 96.4
woman 96.1
man 95.3
group together 95.2
music 95
adult 94.4
street 92.3
child 89.6
portrait 86.8
girl 86.7
adolescent 85.8
recreation 83.7
school 83.6
dancing 81.9
three 80.9
coverage 80.6
boy 80.2
guitar 79.8

Imagga
created on 2022-01-08

person 24.6
adult 23.4
people 22.9
man 19.5
black 18.8
hair 17.4
portrait 16.8
world 15.5
sexy 15.3
human 15
one 14.9
male 14.2
sitting 13.7
women 13.4
fashion 12.8
pretty 12.6
hair spray 12.4
urban 12.2
looking 12
body 12
happy 11.9
toiletry 11.9
dark 11.7
lifestyle 11.6
face 11.4
attractive 11.2
spectator 10.5
style 10.4
youth 10.2
alone 10
head 9.2
city 9.1
hand 9.1
business 9.1
water 8.7
model 8.6
two 8.5
skin 8.5
modern 8.4
studio 8.4
joy 8.4
silhouette 8.3
joyful 8.3
girls 8.2
wet 8
worker 8
newspaper 8
boss 7.6
passenger 7.6
relaxation 7.5
fun 7.5
sensual 7.3
sensuality 7.3
dress 7.2
job 7.1
working 7.1
businessman 7.1
glass 7
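
The Imagga tag list above follows the shape returned by Imagga's v2 tagging endpoint. A hedged sketch of that REST call is below; the API key, secret, and image URL are placeholders.

```python
# Hypothetical sketch: requesting tags from the Imagga v2 tagging API.
# API credentials and the image URL are placeholders.
import requests

IMAGGA_KEY = "your_api_key"
IMAGGA_SECRET = "your_api_secret"
IMAGE_URL = "https://example.org/davidson-untitled.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
resp.raise_for_status()

# Each result carries a tag name and a confidence, e.g. "person 24.6".
for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```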

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 99.2
person 97.2
clothing 94.3
black and white 91.7
poster 82.5
man 81
street 70.8
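
Tags like the Microsoft set above typically come from Azure Computer Vision's image analysis endpoint. A sketch using the plain REST API follows; the endpoint host, subscription key, and image URL are placeholders.

```python
# Hypothetical sketch: tagging an image with Azure Computer Vision (v3.2 analyze).
# Endpoint, subscription key, and image URL are placeholders.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
KEY = "your_subscription_key"
IMAGE_URL = "https://example.org/davidson-untitled.jpg"

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
    timeout=30,
)
resp.raise_for_status()

# Confidences come back in [0, 1]; the listing above shows them as percentages.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```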

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 18-24
Gender Male, 100%
Calm 89.3%
Sad 7.5%
Angry 1.7%
Confused 0.5%
Surprised 0.3%
Fear 0.3%
Disgusted 0.2%
Happy 0.1%

AWS Rekognition

Age 20-28
Gender Female, 100%
Fear 50.6%
Calm 45.2%
Surprised 1.2%
Sad 1%
Angry 0.7%
Confused 0.5%
Happy 0.5%
Disgusted 0.3%
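
The two AWS Rekognition blocks above (age range, gender, and emotion percentages, one block per detected face) match the output of Rekognition's detect_faces call with all facial attributes requested. A minimal sketch, with placeholder bucket and key:

```python
# Hypothetical sketch: per-face age, gender, and emotion estimates with AWS Rekognition.
# Bucket and key are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "davidson-untitled.jpg"}},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.0f}%')
    # Emotions are reported with confidences; sort so the dominant one comes first.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```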

Microsoft Cognitive Services

Age 27
Gender Male

Microsoft Cognitive Services

Age 21
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
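
The repeated Google Vision blocks above (one per detected face) report likelihood buckets rather than percentages, which is how the Cloud Vision face_detection call expresses its results. A sketch with the google-cloud-vision client; the image URI is a placeholder.

```python
# Hypothetical sketch: face likelihoods with the Google Cloud Vision API.
# The image URI below is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image()
image.source.image_uri = "https://example.org/davidson-untitled.jpg"

response = client.face_detection(image=image)

# Likelihoods are enum buckets (VERY_UNLIKELY ... VERY_LIKELY), as listed above.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```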

Feature analysis

Amazon

Person 99.8%
Shoe 93.3%
Poster 59.4%

Categories

Imagga

paintings art 98.5%

Text analysis

Amazon

AREA
IN THIS AREA
IN
THIS
NO
NO DRINKING
DRINKING
CIGARET
55.
anne 55.
anne
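
Fragments like the Amazon lines above, where whole phrases appear alongside their individual words (OCR misreads included), are characteristic of Rekognition's detect_text output, which returns both line- and word-level detections. A hedged sketch with placeholder bucket and key:

```python
# Hypothetical sketch: OCR text detection with AWS Rekognition.
# Bucket and key are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "davidson-untitled.jpg"}},
)

# Rekognition returns both LINE and WORD detections, which is why phrases
# such as "NO DRINKING" appear alongside the single words "NO" and "DRINKING".
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"], f'{detection["Confidence"]:.1f}')
```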

Google

NO DRINKING IN THIS AREA CICARET
NO
DRINKING
IN
THIS
AREA
CICARET
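
The Google block above, a full phrase followed by the individual words, matches the output of Google Vision's text_detection call, where the first annotation carries the complete detected text. A minimal sketch, with a placeholder image URI:

```python
# Hypothetical sketch: OCR with the Google Cloud Vision text_detection call.
# The image URI below is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image()
image.source.image_uri = "https://example.org/davidson-untitled.jpg"

response = client.text_detection(image=image)

# The first annotation is the full text block; the rest are individual words,
# which matches the listing above ("NO DRINKING IN THIS AREA ..." then word by word).
for annotation in response.text_annotations:
    print(annotation.description)
```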