Human Generated Data

Title

Untitled (men drinking at bar)

Date

1947

People

Artist: John Howell, American, active 1930s-1960s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21706

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Person 99.6
Human 99.6
Person 99.5
Person 99.4
Person 99.3
Person 99.3
Person 98.7
Person 97.4
Person 97.2
Meal 75.8
Food 75.8
Suit 73.9
Coat 73.9
Clothing 73.9
Overcoat 73.9
Apparel 73.9
Face 73
Crowd 68.7
People 65.4
Shelf 62.7
Room 58
Indoors 58
Sitting 56.9
Cafeteria 56.3
Restaurant 56.3
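
The Amazon tags above are raw label detections with confidence scores (0-100). A minimal sketch of how this kind of output is typically produced with the Amazon Rekognition DetectLabels API via boto3 follows; the file name and confidence threshold are placeholders, not values from the museum's actual pipeline.

# Hedged sketch: label detection with Amazon Rekognition (boto3).
# "photograph.jpg" and the 55.0 threshold are placeholder values.
import boto3

def rekognition_labels(image_path: str, min_confidence: float = 55.0):
    client = boto3.client("rekognition")  # uses your configured AWS credentials
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=min_confidence,
        )
    # Each label carries a name and a confidence score (0-100), as listed above.
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

for name, confidence in rekognition_labels("photograph.jpg"):
    print(f"{name} {confidence:.1f}")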

Clarifai
created on 2023-10-22

people 99.8
group 99
group together 98.6
adult 97
man 96.5
woman 96
indoors 95.6
child 94.1
furniture 94
many 93.5
room 92.7
monochrome 90
five 88.7
school 88.4
education 88.1
several 87.6
employee 86.9
boy 85.9
recreation 85.7
four 83.4
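
A comparable request against Clarifai's general recognition model might look like the sketch below. It assumes Clarifai's v2 REST predict endpoint with an API key; the key and model name are placeholders, and the current Clarifai platform may use a different authentication scheme, so treat this as an illustration only.

# Hedged sketch: concept tagging via Clarifai's v2 REST predict endpoint.
# CLARIFAI_API_KEY and MODEL_ID are placeholder/assumed values.
import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"
MODEL_ID = "general-image-recognition"

def clarifai_tags(image_url: str):
    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": image_url}}}]},
        timeout=30,
    )
    response.raise_for_status()
    concepts = response.json()["outputs"][0]["data"]["concepts"]
    # Concept values are in [0, 1]; multiply by 100 to match the list above.
    return [(c["name"], c["value"] * 100) for c in concepts]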

Imagga
created on 2022-03-11

marimba 100
percussion instrument 100
musical instrument 100
people 25.1
man 21.5
male 18.4
happy 15.6
adult 14.9
old 14.6
lifestyle 14.4
couple 13.9
smiling 13.7
sitting 13.7
women 13.4
architecture 12.5
men 12
person 11.8
city 11.6
indoors 11.4
love 11
together 10.5
vacation 9.8
business 9.7
restaurant 9.5
travel 9.1
leisure 9.1
holding 9.1
portrait 9.1
landmark 9
drinking 8.6
glass 8.5
adults 8.5
friends 8.4
senior 8.4
pretty 8.4
famous 8.4
tourism 8.2
outdoors 8.2
cheerful 8.1
group 8.1
water 8
home 8
interior 8
building 7.9
happiness 7.8
table 7.8
finance 7.6
hand 7.6
house 7.5
drink 7.5
enjoyment 7.5
aged 7.2
office 7.2
black 7.2
holiday 7.2
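
Imagga exposes a REST tagging endpoint with HTTP Basic authentication. A rough sketch, with placeholder credentials and image URL:

# Hedged sketch: tagging via Imagga's v2 REST API; key, secret, and URL are placeholders.
import requests

IMAGGA_KEY = "YOUR_API_KEY"
IMAGGA_SECRET = "YOUR_API_SECRET"

def imagga_tags(image_url: str):
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": image_url},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),  # HTTP Basic auth
        timeout=30,
    )
    response.raise_for_status()
    # Each entry has a confidence (0-100) and a localized tag name.
    return [(t["tag"]["en"], t["confidence"])
            for t in response.json()["result"]["tags"]]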

Google
created on 2022-03-11

Photograph 94.2
Snapshot 74.3
Engineering 71.5
Event 71
Room 69.7
Monochrome 69.6
Monochrome photography 68.9
Science 65.7
T-shirt 63.7
Crew 62.4
History 59.4
Machine 57.9
Team 56.7
Table 56.2
Metal 55.5
Suit 54
Art 53.7
Window 51.4
Font 50.6
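
The Google tags correspond to label detection in the Google Cloud Vision API. A minimal sketch with the google-cloud-vision client library; the file path is a placeholder and credentials are assumed to come from the environment.

# Hedged sketch: label detection with the google-cloud-vision client library.
# Assumes GOOGLE_APPLICATION_CREDENTIALS is set; the file path is a placeholder.
from google.cloud import vision

def google_labels(image_path: str):
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.label_detection(image=image)
    # Scores are in [0, 1]; multiply by 100 to match the percentages above.
    return [(label.description, label.score * 100)
            for label in response.label_annotations]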

Microsoft
created on 2022-03-11

person 98.6
text 98.2
man 92.5
clothing 87
black and white 84.1
table 68.2
posing 46
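
The Microsoft tags match the output format of Azure's Computer Vision tagging endpoint. A sketch assuming that service and its Python SDK; the endpoint, key, and image URL are placeholders.

# Hedged sketch: image tagging with the Azure Computer Vision SDK.
# ENDPOINT and KEY are placeholders for a real Azure resource.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
KEY = "YOUR_KEY"

def azure_tags(image_url: str):
    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))
    result = client.tag_image(image_url)
    # Each tag has a name and a confidence in [0, 1].
    return [(tag.name, tag.confidence * 100) for tag in result.tags]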

Color Analysis

Face analysis

AWS Rekognition

Age 50-58
Gender Male, 99.3%
Happy 80%
Sad 8.5%
Calm 4.4%
Confused 3.3%
Surprised 1.2%
Disgusted 1.1%
Angry 0.8%
Fear 0.7%

AWS Rekognition

Age 48-56
Gender Male, 99.8%
Confused 73.7%
Happy 13.8%
Surprised 5.2%
Calm 2.4%
Disgusted 1.6%
Sad 1.6%
Fear 0.8%
Angry 0.8%

AWS Rekognition

Age 47-53
Gender Male, 100%
Happy 94.8%
Confused 1.6%
Disgusted 0.8%
Calm 0.7%
Sad 0.6%
Surprised 0.5%
Angry 0.5%
Fear 0.4%

AWS Rekognition

Age 38-46
Gender Male, 99%
Surprised 53%
Happy 40.1%
Sad 3%
Confused 0.9%
Fear 0.9%
Disgusted 0.8%
Calm 0.8%
Angry 0.4%

AWS Rekognition

Age 39-47
Gender Male, 97.1%
Confused 31.4%
Happy 25.9%
Calm 18.4%
Surprised 10.4%
Sad 7.2%
Fear 3.5%
Disgusted 2.4%
Angry 0.8%

AWS Rekognition

Age 45-51
Gender Male, 100%
Calm 36.8%
Sad 26.3%
Happy 10.9%
Surprised 10.4%
Angry 6.5%
Confused 4.5%
Disgusted 2.5%
Fear 2.1%

AWS Rekognition

Age 48-54
Gender Male, 98.4%
Happy 69%
Surprised 17.5%
Confused 8%
Sad 1.6%
Disgusted 1.2%
Calm 0.9%
Fear 0.9%
Angry 0.8%

AWS Rekognition

Age 29-39
Gender Male, 72.6%
Happy 61.9%
Sad 21.2%
Surprised 11.2%
Angry 2.7%
Fear 1.1%
Disgusted 0.8%
Confused 0.7%
Calm 0.6%

AWS Rekognition

Age 52-60
Gender Male, 81.2%
Calm 94%
Sad 4.1%
Confused 1%
Surprised 0.3%
Happy 0.3%
Disgusted 0.2%
Angry 0.2%
Fear 0.1%
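
Each AWS Rekognition block above describes one detected face: an estimated age range, a gender guess with confidence, and a ranked set of emotion scores. A minimal sketch of the underlying DetectFaces call via boto3, with a placeholder file name:

# Hedged sketch: face attributes with Amazon Rekognition DetectFaces (boto3).
# The file name is a placeholder.
import boto3

def rekognition_faces(image_path: str):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])
    results = []
    for face in response["FaceDetails"]:
        emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
        results.append({
            "age_range": face["AgeRange"],   # e.g. {"Low": 50, "High": 58}
            "gender": face["Gender"],        # e.g. {"Value": "Male", "Confidence": 99.3}
            "emotions": emotions,            # list of {"Type", "Confidence"}
        })
    return results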

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
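
The Google Vision blocks report per-face likelihoods on a fixed scale (Very unlikely through Very likely) rather than percentages. A sketch of face detection with google-cloud-vision, with a placeholder file path:

# Hedged sketch: face detection with google-cloud-vision; the path is a placeholder.
from google.cloud import vision

def google_faces(image_path: str):
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    faces = client.face_detection(image=image).face_annotations
    # Likelihoods come back as enum values (VERY_UNLIKELY .. VERY_LIKELY).
    return [
        {
            "surprise": vision.Likelihood(face.surprise_likelihood).name,
            "anger": vision.Likelihood(face.anger_likelihood).name,
            "sorrow": vision.Likelihood(face.sorrow_likelihood).name,
            "joy": vision.Likelihood(face.joy_likelihood).name,
            "headwear": vision.Likelihood(face.headwear_likelihood).name,
            "blurred": vision.Likelihood(face.blurred_likelihood).name,
        }
        for face in faces
    ]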

Feature analysis

Amazon

Person
Suit
Person 99.6%
Person 99.5%
Person 99.4%
Person 99.3%
Person 99.3%
Person 98.7%
Person 97.4%
Person 97.2%
Suit 73.9%

Categories

Text analysis

Amazon

as
i
1
& i 1
&
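
The Amazon entries are raw OCR fragments from Rekognition's DetectText API, which returns LINE and WORD detections with confidences. A minimal sketch via boto3, with a placeholder file name:

# Hedged sketch: text detection with Amazon Rekognition (boto3); path is a placeholder.
import boto3

def rekognition_text(image_path: str):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})
    # Each detection is a LINE or WORD with its own confidence.
    return [(d["DetectedText"], d["Type"], d["Confidence"])
            for d in response["TextDetections"]]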

Google

YT37A2-XAG
YT37A2-XAG
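
The repeated Google string likely reflects how Google Vision's text detection returns both a full-text annotation and individual word entries, which coincide here. A sketch of the call, with a placeholder file path:

# Hedged sketch: OCR with google-cloud-vision text_detection; path is a placeholder.
from google.cloud import vision

def google_text(image_path: str):
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    annotations = client.text_detection(image=image).text_annotations
    # The first entry is the full detected text; the rest are individual words.
    return [a.description for a in annotations]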