Human Generated Data

Title

Untitled (diners seated at long tables with Happy Day Soap Powder)

Date

c. 1930

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2080

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.5
Human 99.5
Person 99.4
Person 98.3
Person 98.3
Person 96.2
Person 95.3
Person 95
Clothing 94.9
Apparel 94.9
Person 94.7
Person 92.6
Person 90.8
Person 88.8
Building 88.2
Person 87.7
Person 84.5
Person 83.4
Face 82.5
Person 79.6
People 78
Crowd 77
Meal 70
Food 70
Person 68.8
Indoors 68.3
Urban 66.5
Person 65.9
Photography 62.4
Photo 62.4
Person 62.4
Gown 62.3
Fashion 62.3
Room 61.1
Dinosaur 60.3
Animal 60.3
Reptile 60.3
Table 59.3
Furniture 59.3
Robe 59
Person 58.9
Female 57.6
Factory 55.4
Person 43.8
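
The figure beside each tag is the service's confidence on a 0-100 scale (the Feature analysis section below shows the same values with percent signs). Label/confidence pairs of this kind can be produced with Amazon Rekognition's DetectLabels call. The following is a minimal sketch only, assuming boto3 is installed, AWS credentials are configured, and the photograph is saved locally under the hypothetical name hamblin_diners.jpg.

# Minimal sketch: label/confidence pairs for an image via Amazon Rekognition DetectLabels.
# Assumes boto3 and configured AWS credentials; the file name is hypothetical.
import boto3

client = boto3.client("rekognition")

with open("hamblin_diners.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,       # cap the number of labels returned
    MinConfidence=40,   # drop very low-confidence labels
)

# Print each label with its confidence, mirroring the "Person 99.5" style list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')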

Clarifai
created on 2023-10-15

people 99.8
many 99.7
group 99.5
adult 97.7
man 96.4
crowd 95.9
group together 95.3
leader 94.4
veil 94.2
woman 91.8
wear 90.1
administration 85.7
audience 85.4
illustration 82.6
religion 81.3
war 81.3
military 76.4
child 74.3
education 73.7
ceremony 69.7

Imagga
created on 2021-12-14

harp 25.9
device 20.8
support 20.1
group 18.5
crowd 17.3
building 14.6
people 14.5
musical instrument 14.2
man 13.4
art 13.1
stringed instrument 13
business 12.7
black 12.6
old 12.5
architecture 11.9
design 11.8
stone 11.8
men 11.2
symbol 10.8
silhouette 10.8
world 10.2
sitar 10
landscape 9.7
sky 9.6
scene 9.5
party 9.5
person 9.4
industry 9.4
film 9.3
shower curtain 9.3
male 9.2
travel 9.1
city 9.1
hand 9.1
global 9.1
negative 9
music 9
businessman 8.8
spectator 8.5
house 8.4
structure 8.3
curtain 8.3
tourism 8.2
water 8
women 7.9
work 7.8
rock 7.8
concert 7.8
blind 7.6
roof 7.6
finance 7.6
park 7.3
industrial 7.3
day 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 96.8
person 91
clothing 89.9
man 74.1
group 65
old 64
woman 58.3
black and white 52.2

Color Analysis

Face analysis

AWS Rekognition

Age 25-39
Gender Female, 55.3%
Sad 46.7%
Angry 30.3%
Calm 10.8%
Confused 5.6%
Surprised 2.6%
Happy 2%
Fear 1.2%
Disgusted 0.7%

AWS Rekognition

Age 32-48
Gender Male, 89.1%
Sad 63.1%
Calm 25.9%
Happy 10.6%
Confused 0.3%
Angry 0.1%
Surprised 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 36-52
Gender Male, 90.2%
Calm 80.5%
Happy 18.6%
Sad 0.5%
Confused 0.2%
Angry 0.1%
Surprised 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 26-40
Gender Female, 70.2%
Calm 97.4%
Happy 1.7%
Sad 0.6%
Angry 0.1%
Confused 0.1%
Surprised 0.1%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 27-43
Gender Male, 73.5%
Happy 57.2%
Calm 33.6%
Sad 5.7%
Confused 2.1%
Angry 0.7%
Surprised 0.4%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 27-43
Gender Male, 96.2%
Calm 81.3%
Sad 12.1%
Happy 4.5%
Angry 1.2%
Confused 0.4%
Surprised 0.2%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 20-32
Gender Male, 66.6%
Sad 53.9%
Calm 42.7%
Happy 2.5%
Confused 0.4%
Angry 0.2%
Fear 0.1%
Surprised 0.1%
Disgusted 0.1%

AWS Rekognition

Age 41-59
Gender Female, 64.4%
Calm 70.9%
Happy 18.2%
Sad 9.2%
Confused 0.9%
Angry 0.5%
Surprised 0.2%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 22-34
Gender Male, 84.6%
Calm 65.5%
Happy 16%
Surprised 9.5%
Sad 3.3%
Confused 2.4%
Angry 1.8%
Fear 1.1%
Disgusted 0.5%

AWS Rekognition

Age 7-17
Gender Female, 94.8%
Happy 35.9%
Calm 23.1%
Confused 18.8%
Sad 7.9%
Surprised 6.7%
Angry 4.2%
Fear 1.8%
Disgusted 1.6%

AWS Rekognition

Age 19-31
Gender Female, 63.3%
Calm 66.2%
Happy 21.7%
Sad 8.2%
Confused 1.8%
Angry 1.4%
Surprised 0.3%
Disgusted 0.2%
Fear 0.2%

AWS Rekognition

Age 16-28
Gender Male, 69.4%
Happy 45.4%
Calm 42%
Sad 8.8%
Confused 1.4%
Angry 1.2%
Surprised 0.7%
Fear 0.4%
Disgusted 0.2%

AWS Rekognition

Age 25-39
Gender Female, 87%
Calm 49.3%
Sad 45.1%
Happy 3.3%
Angry 1.2%
Confused 0.3%
Fear 0.3%
Surprised 0.3%
Disgusted 0.1%

AWS Rekognition

Age 22-34
Gender Female, 54.1%
Happy 81.3%
Calm 14.5%
Sad 2%
Surprised 0.9%
Angry 0.6%
Confused 0.6%
Fear 0.1%
Disgusted 0%

AWS Rekognition

Age 18-30
Gender Female, 88.1%
Sad 48.2%
Calm 43.4%
Fear 5.3%
Confused 1.1%
Happy 0.8%
Disgusted 0.5%
Angry 0.4%
Surprised 0.3%

AWS Rekognition

Age 23-35
Gender Male, 75.7%
Sad 74.6%
Calm 12.5%
Happy 11.1%
Angry 0.5%
Fear 0.5%
Confused 0.5%
Surprised 0.2%
Disgusted 0.1%

AWS Rekognition

Age 39-57
Gender Female, 68.3%
Calm 52.1%
Happy 37.9%
Sad 7.1%
Angry 1.6%
Confused 0.8%
Surprised 0.2%
Disgusted 0.2%
Fear 0.1%
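
Each block above (estimated age range, gender, and emotions ranked by confidence) matches the shape of Amazon Rekognition's DetectFaces response. A minimal sketch of such a call, under the same assumptions as before (boto3, configured AWS credentials, hypothetical local file name):

# Minimal sketch: per-face age range, gender, and emotion estimates via Amazon Rekognition DetectFaces.
# Assumes boto3 and configured AWS credentials; the file name is hypothetical.
import boto3

client = boto3.client("rekognition")

with open("hamblin_diners.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age, gender, and emotion attributes, not just bounding boxes
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back with confidences; sort high to low to match the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')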

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
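
The Google Vision blocks report likelihood buckets (Very unlikely through Very likely) rather than percentages. A minimal sketch of the corresponding face-detection call, assuming the google-cloud-vision client library is installed, credentials are configured, and the same hypothetical file name is used:

# Minimal sketch: face likelihood buckets via the Google Cloud Vision client.
# Assumes google-cloud-vision is installed and credentials are configured; the file name is hypothetical.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("hamblin_diners.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood values are enum buckets (VERY_UNLIKELY ... VERY_LIKELY),
# corresponding to the "Very unlikely" ... "Very likely" wording above.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)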

Feature analysis

Amazon

Person 99.5%
Dinosaur 60.3%

Categories

Text analysis

Amazon

VMI
SOOWH

Google

VMI
VMI
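
The fragments above are OCR detections of text visible in the photograph. A minimal sketch of a word-level text detection with Amazon Rekognition, under the same assumptions (boto3, configured AWS credentials, hypothetical file name):

# Minimal sketch: OCR word detections via Amazon Rekognition DetectText.
# Assumes boto3 and configured AWS credentials; the file name is hypothetical.
import boto3

client = boto3.client("rekognition")

with open("hamblin_diners.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# Report word-level detections only, matching the short fragments listed above.
for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        print(detection["DetectedText"])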