Human Generated Data

Title

Untitled (man with two girls at soda fountain)

Date

c. 1940

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1516

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.4
Human 99.4
Person 99.4
Clothing 99.2
Apparel 99.2
Person 96.2
Sunglasses 93.4
Accessories 93.4
Accessory 93.4
Robe 85
Fashion 85
Gown 81.3
Indoors 80.4
Face 80.4
Bridegroom 77.9
Wedding 77.9
People 75
Female 69.2
Wedding Gown 67.4
Photography 66.6
Photo 66.6
Portrait 66.3
Room 60.1
Meal 58.2
Food 58.2
Coat 56.4
Crowd 56.1
Overcoat 55.4
Suit 55.4
Bride 55

Clarifai
created on 2023-10-15

people 99.5
man 97.9
adult 97
woman 96.9
group 96.8
indoors 92.8
two 88.2
sit 88.1
veil 85.7
groom 84.7
leader 83.8
wedding 83.2
three 83.1
family 82.4
monochrome 82.1
couple 77.7
portrait 77.4
interaction 74.9
group together 74.8
wear 73.1

Imagga
created on 2021-12-14

sketch 99.4
drawing 81.2
representation 58
grunge 16.2
man 16.1
pattern 13.7
people 13.4
design 12.9
art 12.2
technology 11.9
silhouette 11.6
negative 10.8
person 10.7
retro 10.6
male 10.6
style 10.4
science 9.8
working 9.7
graphic 9.5
decoration 9.4
modern 9.1
business 9.1
paint 9.1
health 9
film 8.7
work 8.6
poster 8.5
human 8.2
glass 8.1
team 8.1
cartoon 8
water 8
medicine 7.9
color 7.8
finance 7.6
professional 7.6
doctor 7.5
clean 7.5
equipment 7.5
element 7.4
clip art 7.4
medical 7.1
architecture 7

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 99.2
window 89.7
wedding dress 86.4
wedding 85.5
bride 76.6
human face 71.7
person 69.7
clothing 65.2
white goods 50.7

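Each machine-generated tag above is paired with a confidence score (0–100). A minimal sketch of how such tag lists can be filtered by a minimum-confidence cutoff, assuming the tags are held as simple (label, confidence) pairs; the helper name `filter_tags` and the cutoff value are illustrative, not part of any vendor API:

```python
def filter_tags(tags, min_confidence=80.0):
    """Return only the (label, confidence) pairs at or above the cutoff."""
    return [(label, conf) for label, conf in tags if conf >= min_confidence]

# A few of the Amazon-generated tags listed above, as (label, confidence) pairs.
amazon_tags = [
    ("Person", 99.4),
    ("Clothing", 99.2),
    ("Sunglasses", 93.4),
    ("Robe", 85.0),
    ("Gown", 81.3),
    ("Female", 69.2),
    ("Bride", 55.0),
]

high_confidence = filter_tags(amazon_tags, min_confidence=80.0)
for label, conf in high_confidence:
    print(f"{label} {conf}")
```

With an 80-point cutoff, only the first five tags survive; lower-confidence guesses such as "Female" (69.2) and "Bride" (55.0) are dropped.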
Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 24-38
Gender Female, 65.5%
Happy 94.9%
Calm 4.3%
Sad 0.4%
Surprised 0.1%
Angry 0.1%
Fear 0.1%
Confused 0.1%
Disgusted 0%

AWS Rekognition

Age 24-38
Gender Female, 99%
Happy 89.5%
Sad 4.8%
Calm 3.5%
Fear 0.7%
Surprised 0.6%
Confused 0.4%
Angry 0.4%
Disgusted 0.1%

AWS Rekognition

Age 36-54
Gender Female, 52%
Calm 55.7%
Sad 27%
Happy 9.6%
Confused 3.2%
Disgusted 2.2%
Angry 1.2%
Surprised 0.7%
Fear 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Sunglasses 93.4%

Categories

Imagga

paintings art 99.8%

Text analysis

Amazon

5
17

Google

17
17