Human Generated Data

Title

Untitled (dining room, Columbia Prep School, young men)

Date

c.1940

People

Artist: Harris & Ewing, American 1910s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.22255

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Person 99.2
Human 99.2
Person 99.1
Person 98.9
Restaurant 98.8
Person 97
Person 96.4
Indoors 93.6
Room 93.6
Person 93.1
Person 93.1
Person 92.4
Person 91.6
Cafeteria 91.4
Meal 87.3
Food 87.3
Person 86.9
School 86.7
Classroom 86.7
Person 83.2
Interior Design 81.7
Person 81.6
Person 80.7
Person 74.6
Person 72
Dish 69.8
Crowd 69.5
People 65.9
Person 64.7
Furniture 59.6
Food Court 59
Cafe 58.5
Bowl 57.2
Audience 56.5
Workshop 55.3
Person 55

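Each machine-generated tag above pairs a label with a confidence score (a percentage). A minimal sketch, in Python, of parsing such label/score lines and keeping only high-confidence labels — the function names and threshold here are illustrative, not part of any vendor API:

```python
# Parse "Label 99.2" lines like the machine-generated tags above
# and keep only labels at or above a confidence threshold.
def parse_labels(lines):
    """Split each 'Label 99.2' line into a (label, score) pair."""
    pairs = []
    for line in lines:
        name, score = line.rsplit(" ", 1)  # label text may contain spaces
        pairs.append((name, float(score)))
    return pairs

def filter_labels(pairs, threshold=90.0):
    """Keep labels whose confidence meets the threshold."""
    return [name for name, score in pairs if score >= threshold]

tags = ["Person 99.2", "Restaurant 98.8", "Cafeteria 91.4", "Bowl 57.2"]
print(filter_labels(parse_labels(tags)))  # → ['Person', 'Restaurant', 'Cafeteria']
```

A threshold around 90 keeps only the tags the services are most certain of (here: people, restaurant, cafeteria) and drops speculative ones like "Bowl 57.2".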
Imagga
created on 2022-03-11

table 52.8
interior 46.9
restaurant 46.3
chair 40.9
room 38.5
banquet 37.2
dinner 32.1
glass 30.9
furniture 27.8
decor 25.6
dining 24.7
luxury 23.1
indoors 22
modern 21.7
design 20.2
setting 20.2
hall 20
lunch 19.7
food 18.5
architecture 18
style 17.8
floor 17.7
chairs 17.6
drink 17.5
wood 17.5
meal 16.8
home 16.7
house 16.7
party 16.3
cafeteria 16.2
building 16.1
decoration 16
wine 15.9
salon 15.9
service 15.7
seat 15.4
eat 15.1
tables 14.8
inside 14.7
elegant 14.6
window 13.9
empty 13.7
comfortable 13.4
hotel 13.4
wedding 12.9
reception 12.7
catering 12.7
cutlery 12.7
structure 12.4
nobody 12.4
contemporary 12.2
bar 12
indoor 11.9
percussion instrument 11.8
napkin 11.7
fork 11.5
musical instrument 11.4
place 11.2
event 11.1
silverware 11
elegance 10.9
traditional 10.8
knife 10.6
residential 10.5
plate 10.2
formal 9.5
alcohol 9.5
light 9.4
steel drum 9.2
business 9.1
life 9
dine 8.8
celebration 8.8
urban 8.7
decorate 8.6
estate 8.5
kitchen 8.2
dish 8.1
classroom 8
supper 7.9
stool 7.9
upscale 7.9
tablecloth 7.8
people 7.8
serve 7.8
scene 7.8
fancy 7.7
set 7.6
lamp 7.6
real 7.6
living 7.6
horizontal 7.5
relaxation 7.5

Google
created on 2022-03-11

White 92.2
Black 89.9
Black-and-white 85.1
Style 83.9
Monochrome 76.8
Monochrome photography 76.6
Event 73.6
Room 71.6
Table 68.2
Crowd 67.5
Chair 67.1
History 66.3
Team 62.9
Vintage clothing 62.4
Photo caption 61.7
Stock photography 61.6
Hat 61.1
Art 60.9
Suit 58.6
Class 54.6

Microsoft
created on 2022-03-11

person 96.5
house 90.7
text 85.9
black and white 85.8
clothing 80.1
man 77.4
table 72.6
group 71
furniture 66.6
people 57.6
crowd 1

Face analysis

AWS Rekognition

Age 20-28
Gender Male, 96.8%
Calm 50%
Confused 16%
Sad 12.3%
Happy 10.4%
Disgusted 5.1%
Fear 3.2%
Surprised 2%
Angry 1%

AWS Rekognition

Age 41-49
Gender Male, 71.5%
Calm 100%
Confused 0%
Surprised 0%
Happy 0%
Sad 0%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 13-21
Gender Female, 96.2%
Sad 68.7%
Calm 28.5%
Confused 1.2%
Angry 0.6%
Happy 0.4%
Surprised 0.3%
Disgusted 0.2%
Fear 0.2%

AWS Rekognition

Age 38-46
Gender Male, 99.1%
Calm 67.7%
Sad 14.6%
Happy 6.6%
Confused 5.5%
Surprised 2.7%
Angry 1.5%
Disgusted 1%
Fear 0.5%

AWS Rekognition

Age 26-36
Gender Male, 68.2%
Calm 94.9%
Happy 2.6%
Sad 1.1%
Disgusted 0.4%
Confused 0.3%
Surprised 0.3%
Fear 0.3%
Angry 0.1%

AWS Rekognition

Age 16-22
Gender Female, 85%
Calm 99%
Sad 0.4%
Happy 0.3%
Surprised 0.1%
Disgusted 0.1%
Confused 0.1%
Angry 0.1%
Fear 0%

AWS Rekognition

Age 30-40
Gender Male, 76.1%
Confused 33.8%
Sad 26.6%
Disgusted 16.2%
Calm 14.2%
Happy 5.1%
Angry 2.3%
Fear 0.9%
Surprised 0.9%

AWS Rekognition

Age 23-33
Gender Female, 55%
Sad 98.9%
Fear 0.3%
Happy 0.2%
Calm 0.2%
Disgusted 0.1%
Angry 0.1%
Surprised 0.1%
Confused 0.1%
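Each AWS Rekognition face record above lists eight emotion scores summing to roughly 100%. A minimal sketch of reducing such a record to its dominant emotion — the dictionary format is illustrative, not the actual Rekognition response shape:

```python
def dominant_emotion(scores):
    """Return the (emotion, score) pair with the highest confidence."""
    return max(scores.items(), key=lambda kv: kv[1])

# Scores from the first face record above.
face = {"Calm": 50.0, "Confused": 16.0, "Sad": 12.3, "Happy": 10.4,
        "Disgusted": 5.1, "Fear": 3.2, "Surprised": 2.0, "Angry": 1.0}
print(dominant_emotion(face))  # → ('Calm', 50.0)
```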

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%

Captions

Microsoft

a group of people sitting at a train station 71.3%
a group of people in a room 71.2%
a group of people standing in front of a window 71.1%

Text analysis

Amazon

KODVK-2VLE1A