Human Generated Data

Title

Untitled (group sitting at dinner table)

Date

1948

People

Artist: Orrion Barger, American, active 1913–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6218

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Chair 99.6
Furniture 99.6
Sitting 99.5
Human 99.5
Person 99.2
Person 99.2
Person 99
Tabletop 97.6
Person 97.6
Chair 96.5
Restaurant 96.4
Person 93.9
Table 89.7
Meal 86
Food 86
Person 81.7
Dining Table 81.6
Apparel 79.9
Clothing 79.9
Dish 78.6
Sunglasses 76.1
Accessory 76.1
Accessories 76.1
Glass 73.1
Person 71.9
Overcoat 71
Coat 71
Cafeteria 63.3
Cafe 63.2
People 62.6
Bar Counter 62.6
Pub 62.6
Person 61.4
Crowd 60.7
Room 59.4
Indoors 59.4
Couch 57.3
Face 56
Food Court 55.2
Suit 52.2
Person 45.5
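
The label/confidence pairs above match the output of AWS Rekognition's DetectLabels API. As a minimal sketch of how such tags can be produced (the bucket, object key, and thresholds below are hypothetical placeholders, not details from this record):

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    # Hypothetical image location; the museum's actual pipeline is not documented here.
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
    MaxLabels=50,
    MinConfidence=45.0,  # the list above bottoms out near 45%
)

for label in response["Labels"]:
    # Each label carries a name and a confidence score, matching
    # the "Chair 99.6" style entries shown above.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```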

Clarifai
created on 2023-10-26

people 100
group 99.1
leader 99.1
adult 97.9
furniture 97.9
group together 97.7
administration 97.7
chair 97.2
man 96.6
room 93.6
many 93.6
sit 93.5
home 93.4
woman 89.5
military 87.6
meeting 85.7
education 84.9
elderly 84.5
several 84.4
league 82
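
Concept/confidence pairs like these can be requested from Clarifai's v2 predict endpoint. A hedged sketch follows; the model ID, API key, and image URL are assumptions for illustration:

```python
import requests

resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_API_KEY"},  # hypothetical credential
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai reports confidence as a 0-1 value; scaled to a
    # percentage it resembles the "people 100" entries above.
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```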

Imagga
created on 2022-01-22

room 36
classroom 29.9
man 28.9
people 28.4
male 25.5
person 25.1
couple 20.9
chair 19.8
sitting 19.7
indoors 18.4
table 18.2
adult 17.9
musical instrument 17.2
senior 16.9
old 16
lifestyle 15.2
men 14.6
mature 13.9
home 13.5
love 13.4
teacher 13.4
business 13.4
happy 13.1
office 12.3
percussion instrument 11.7
sax 11.4
together 11.4
group 11.3
restaurant 11.2
women 11.1
portrait 11
relaxing 10.9
older 10.7
interior 10.6
businessman 10.6
cheerful 10.6
architecture 10.1
aged 9.9
building 9.7
life 9.6
retirement 9.6
looking 9.6
drinking 9.6
scene 9.5
smiling 9.4
happiness 9.4
hall 9.3
smile 9.3
spectator 9.3
wine 9.2
relaxation 9.2
travel 9.1
waiter 9
executive 8.9
wind instrument 8.9
computer 8.8
seat 8.8
education 8.7
stringed instrument 8.6
desk 8.5
casual 8.5
two 8.5
black 8.4
worker 8.4
suit 8.3
indoor 8.2
work 8.2
dress 8.1
piano 8.1
marimba 8
glass 7.8
retired 7.8
class 7.7
elderly 7.7
student 7.6
husband 7.6
businesspeople 7.6
meeting 7.5
leisure 7.5
glasses 7.4
inside 7.4
school 7.3
history 7.2
romance 7.1
professional 7
modern 7
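
Imagga exposes a comparable tagging endpoint. A minimal sketch, assuming HTTP basic auth with an API key/secret pair and a publicly reachable image URL (both hypothetical):

```python
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},  # hypothetical URL
    auth=("api_key", "api_secret"),  # hypothetical credentials
)
for tag in resp.json()["result"]["tags"]:
    # Imagga returns a confidence score and a localized tag name.
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')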

Google
created on 2022-01-22

Furniture 93.9
Black 89.7
Chair 89.1
Table 86.1
Art 74.7
Snapshot 74.3
Monochrome 73.2
Monochrome photography 72.8
Vintage clothing 71.7
Event 70.7
Classic 69.1
History 65.7
Sitting 65.4
Room 65.3
Stock photography 64.8
Recreation 64.5
Suit 60.8
Conversation 56.7
Illustration 53.1
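
These entries correspond to Google Cloud Vision label detection. A minimal sketch using the google-cloud-vision client library (the image URI is a placeholder):

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image()
image.source.image_uri = "https://example.com/photo.jpg"  # hypothetical URL

response = client.label_detection(image=image)
for label in response.label_annotations:
    # Scores are 0-1; scaled to percentages they resemble "Furniture 93.9".
    print(f"{label.description} {label.score * 100:.1f}")
```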

Microsoft
created on 2022-01-22

text 99
table 92.6
person 89.4
music 86.6
furniture 79
clothing 76.2
man 69.2
chair 69.1
piano 58.2
black and white 56.4
old 44.8
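
The Microsoft tags are consistent with Azure's Computer Vision tagging operation. A hedged sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint and key are placeholders:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",  # hypothetical endpoint
    CognitiveServicesCredentials("YOUR_KEY"),  # hypothetical key
)
result = client.tag_image("https://example.com/photo.jpg")  # hypothetical URL
for tag in result.tags:
    # Confidence is 0-1; scaled it matches entries like "text 99".
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```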

Face analysis

AWS Rekognition

Age 28-38
Gender Male, 98.7%
Happy 97.8%
Surprised 0.7%
Confused 0.6%
Sad 0.2%
Disgusted 0.2%
Angry 0.2%
Fear 0.1%
Calm 0.1%

AWS Rekognition

Age 38-46
Gender Male, 98.2%
Calm 77.4%
Sad 10.5%
Confused 9.3%
Surprised 0.8%
Disgusted 0.7%
Happy 0.6%
Angry 0.4%
Fear 0.3%

AWS Rekognition

Age 38-46
Gender Male, 99.4%
Surprised 36.9%
Happy 30.3%
Calm 19.4%
Confused 5.5%
Sad 4.2%
Disgusted 1.6%
Angry 1.1%
Fear 1%

AWS Rekognition

Age 40-48
Gender Female, 86.8%
Happy 99%
Calm 0.5%
Surprised 0.3%
Confused 0.1%
Disgusted 0.1%
Sad 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 47-53
Gender Male, 99.8%
Sad 37.4%
Happy 17.8%
Calm 12.6%
Confused 8.8%
Angry 7.2%
Surprised 7.1%
Disgusted 4.6%
Fear 4.5%

AWS Rekognition

Age 37-45
Gender Male, 56.5%
Calm 84.6%
Sad 6.7%
Confused 4.3%
Happy 2%
Fear 0.8%
Disgusted 0.6%
Surprised 0.5%
Angry 0.4%
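
Per-face blocks like the six above come from AWS Rekognition's DetectFaces API with full attributes enabled. A minimal sketch (image location hypothetical):

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")
response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},  # hypothetical
    Attributes=["ALL"],  # required to get age range, gender, and emotions
)
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions arrive unsorted; sorting by confidence reproduces the
    # descending lists shown above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```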

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Unlikely
Blurred Very unlikely
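
Unlike Rekognition, Google Vision face detection reports bucketed likelihoods ("Very unlikely" through "Very likely") rather than numeric scores. A minimal sketch, again with a placeholder image URI:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image()
image.source.image_uri = "https://example.com/photo.jpg"  # hypothetical URL

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum, e.g. VERY_UNLIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```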

Feature analysis

Amazon

Chair 99.6%
Person 99.2%
Sunglasses 76.1%
Suit 52.2%

Text analysis

Amazon

EES
KODVK-SVLELA
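
Strings like these are typical of AWS Rekognition's DetectText API run against edge markings and signage in the frame. A minimal sketch (image location hypothetical):

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")
response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}}  # hypothetical
)
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # Rekognition returns LINE and WORD results
        print(detection["DetectedText"])
```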

Google

YT3FA8-XAO
YT3FA8-XAO
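
The Google results are consistent with Cloud Vision text detection. A minimal sketch with a placeholder image URI:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image()
image.source.image_uri = "https://example.com/photo.jpg"  # hypothetical URL

response = client.text_detection(image=image)
for annotation in response.text_annotations:
    # The first annotation is the full detected text; later entries are the
    # individual elements, which is one way the same string can appear twice,
    # as in the repeated line above.
    print(annotation.description)
```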