Human Generated Data

Title

Untitled (people at table at party)

Date

1964

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19207

Machine Generated Data

Tags

Each list below pairs a machine-generated tag with the provider's confidence score in percent; a brief Python request sketch follows each provider's list.

Amazon
created on 2022-03-05

Chair 99.2
Furniture 99.2
Person 99
Human 99
Person 98.6
Person 97
Person 96.9
Restaurant 96.1
Person 95.6
Tabletop 95.5
Person 95.2
Person 95.2
Table 95
Person 94.7
Person 94.6
Meal 93.6
Food 93.6
Dining Table 92.5
Person 91.1
Tablecloth 86.7
Crowd 85.7
Dish 85.7
Suit 84.2
Coat 84.2
Overcoat 84.2
Clothing 84.2
Apparel 84.2
Glass 69.4
Party 69.3
Person 69
Beverage 67.5
Drink 67.5
Cafeteria 64.9
Indoors 64.6
People 63.5
Room 61.7
Bar Counter 61.6
Pub 61.6
Alcohol 59.6
Dining Room 58
Home Decor 57.4
Audience 56
Linen 55.8
Chair 52.8
Person 51
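
The Amazon tags above have the shape of Amazon Rekognition DetectLabels output: a label name paired with a confidence score in percent. The following is a minimal Python sketch of such a request with boto3; the file name and the 50% confidence floor are illustrative assumptions, not values taken from this record.

    # Hedged sketch: request image labels from Amazon Rekognition (boto3).
    import boto3

    rekognition = boto3.client("rekognition")  # relies on configured AWS credentials

    with open("untitled_people_at_table.jpg", "rb") as f:  # hypothetical local file
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=50,  # assumed floor; drops very low-confidence labels
    )

    # Each label carries a name and a confidence score in percent.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')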

Clarifai
created on 2023-10-22

people 99.9
adult 99
group 98.9
group together 98.5
man 98.2
woman 96.7
many 96.5
administration 92
leader 90.9
indoors 90.3
sit 90
furniture 89.6
meeting 88.6
chair 86.9
military 86
child 84.1
recreation 83.8
room 83.6
sitting 83.4
monochrome 83.4
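
The Clarifai concepts above can be requested from Clarifai's v2 "outputs" REST endpoint, which returns concept names with values on a 0-1 scale. A hedged sketch with the requests library; the model id, API key placeholder, and image URL are assumptions, not values from this record.

    # Hedged sketch: request concept tags from the Clarifai v2 REST API.
    import requests

    CLARIFAI_API_KEY = "YOUR_CLARIFAI_KEY"       # placeholder credential
    MODEL_ID = "general-image-recognition"       # assumed general concept model
    IMAGE_URL = "https://example.org/photo.jpg"  # hypothetical image URL

    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
        timeout=30,
    )
    response.raise_for_status()

    # Concepts come back with a 0-1 value; multiplying by 100 matches the percent scale above.
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')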

Imagga
created on 2022-03-05

restaurant 32.3
table 30.6
people 26.2
person 24.6
chair 23.2
man 22.8
waiter 21.7
business 21.2
male 20.6
musical instrument 19.9
percussion instrument 18.7
indoors 18.4
room 17.4
sitting 17.2
meeting 16.9
group 16.9
interior 16.8
dinner 16.7
meal 15.9
food 15.9
happy 15.6
businessman 15
steel drum 15
together 14.9
wine 14.8
employee 14.5
worker 14
couple 13.9
dining-room attendant 13.4
office 13.4
drink 13.4
dining 13.3
glass 13.2
smiling 13
party 12.9
home 12.7
businesswoman 12.7
work 11.9
holding 11.5
desk 11.3
men 11.2
teamwork 11.1
women 11.1
communication 10.9
lifestyle 10.8
handsome 10.7
hall 10.6
adult 10.6
job 10.6
corporate 10.3
inside 10.1
team 9.8
lunch 9.8
modern 9.8
working 9.7
eating 9.2
furniture 9.1
suit 9
success 8.8
celebration 8.8
teacher 8.8
boy 8.7
barroom 8.6
businesspeople 8.5
plate 8.5
friends 8.4
house 8.3
service 8.3
classroom 8.2
building 8.2
entrepreneur 8.2
laptop 8.2
kitchen 8.2
salon 8.1
board 7.9
equipment 7.8
conference 7.8
formal 7.6
workplace 7.6
talking 7.6
hand 7.6
executive 7.5
floor 7.4
coffee 7.4
indoor 7.3
confident 7.3
looking 7.2
life 7.2
smile 7.1
structure 7.1
happiness 7
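
Tag/confidence pairs like the Imagga list above are what Imagga's v2 tagging endpoint returns. A minimal sketch; the credentials and image URL are placeholders.

    # Hedged sketch: request tags from the Imagga v2 REST API.
    import requests

    IMAGGA_KEY = "YOUR_IMAGGA_KEY"               # placeholder credential
    IMAGGA_SECRET = "YOUR_IMAGGA_SECRET"         # placeholder credential
    IMAGE_URL = "https://example.org/photo.jpg"  # hypothetical image URL

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),  # HTTP basic auth with key/secret
        timeout=30,
    )
    response.raise_for_status()

    # Each entry pairs an English tag with a confidence score already in percent.
    for entry in response.json()["result"]["tags"]:
        print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')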

Google
created on 2022-03-05

Photograph 94.2
Black 89.6
Table 87.1
Black-and-white 86
Chair 84.1
Style 84
Font 78.1
Monochrome photography 76.5
Monochrome 75.5
Snapshot 74.3
Suit 74.2
Event 73.5
Room 66
Art 64.8
Stock photography 64.8
Classic 64.8
Formal wear 64.3
History 63.9
Tablecloth 62.9
Tableware 61.8
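
The Google tags correspond to Cloud Vision label detection, where each label annotation carries a description and a 0-1 score (shown above on a percent scale). A sketch assuming application default credentials and a hypothetical local file.

    # Hedged sketch: request labels from Google Cloud Vision.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()  # uses application default credentials

    with open("untitled_people_at_table.jpg", "rb") as f:  # hypothetical local file
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)

    # Scores are 0-1; printed here on the percent scale used in the list above.
    for label in response.label_annotations:
        print(f"{label.description} {label.score * 100:.1f}")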

Microsoft
created on 2022-03-05

person 98.9
text 97.3
wall 96.1
clothing 91.8
indoor 90.4
table 89.6
man 85.7
black and white 75.6
people 70.8
group 56.6
old 40.4
crowd 0.6
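
The Microsoft tags match the Tags feature of the Azure Computer Vision "analyze" REST call, which reports confidences on a 0-1 scale. A hedged sketch; the endpoint, API version, key, and image URL are assumptions.

    # Hedged sketch: request image tags from Azure Computer Vision (REST).
    import requests

    ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder endpoint
    SUBSCRIPTION_KEY = "YOUR_AZURE_KEY"                             # placeholder credential
    IMAGE_URL = "https://example.org/photo.jpg"                     # hypothetical image URL

    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY},
        json={"url": IMAGE_URL},
        timeout=30,
    )
    response.raise_for_status()

    # Tags carry a 0-1 confidence, shown here on the percent scale used above.
    for tag in response.json()["tags"]:
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')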

Face analysis

AWS Rekognition

Age 50-58
Gender Male, 74.2%
Calm 30.9%
Sad 27.4%
Surprised 16%
Confused 13.5%
Happy 5.5%
Disgusted 3.4%
Angry 1.9%
Fear 1.3%

AWS Rekognition

Age 51-59
Gender Male, 98.2%
Sad 59.5%
Calm 35.6%
Happy 2.6%
Confused 1%
Disgusted 0.6%
Angry 0.4%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 33-41
Gender Male, 77.1%
Calm 96.5%
Sad 2.5%
Happy 0.4%
Angry 0.1%
Confused 0.1%
Disgusted 0.1%
Surprised 0.1%
Fear 0%

AWS Rekognition

Age 35-43
Gender Female, 77.6%
Calm 64.9%
Happy 26%
Sad 2.5%
Surprised 2.2%
Confused 2%
Fear 1.1%
Disgusted 0.8%
Angry 0.6%

AWS Rekognition

Age 23-31
Gender Female, 56.9%
Calm 96.7%
Happy 2.1%
Sad 0.8%
Surprised 0.1%
Confused 0.1%
Disgusted 0.1%
Angry 0.1%
Fear 0%
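
The age range, gender, and emotion estimates above have the shape of Amazon Rekognition DetectFaces output with all attributes requested. A minimal boto3 sketch; the file name is a hypothetical placeholder.

    # Hedged sketch: request face attributes from Amazon Rekognition.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("untitled_people_at_table.jpg", "rb") as f:  # hypothetical local file
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Emotions are reported with per-emotion confidence; sort highest first.
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')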

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
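
Google Cloud Vision reports face attributes as likelihood categories (for example "Very unlikely") rather than numeric scores, as in the block above. A sketch using the face detection helper; the file name is a placeholder.

    # Hedged sketch: request face likelihoods from Google Cloud Vision.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("untitled_people_at_table.jpg", "rb") as f:  # hypothetical local file
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Each attribute is a Likelihood enum such as VERY_UNLIKELY or LIKELY.
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)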

Feature analysis

Amazon

Chair 99.2%
Chair 52.8%
Person 99%
Person 98.6%
Person 97%
Person 96.9%
Person 95.6%
Person 95.2%
Person 95.2%
Person 94.7%
Person 94.6%
Person 91.1%
Person 69%
Person 51%
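
The repeated Chair and Person entries with distinct percentages read as instance-level detections. With Amazon Rekognition DetectLabels, each label can carry an Instances list holding a bounding box and a per-object confidence; the sketch below reads that field under the same assumptions as the labels sketch earlier.

    # Hedged sketch: per-object instances from Amazon Rekognition DetectLabels.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("untitled_people_at_table.jpg", "rb") as f:  # hypothetical local file
        response = rekognition.detect_labels(Image={"Bytes": f.read()}, MinConfidence=50)

    for label in response["Labels"]:
        for instance in label.get("Instances", []):
            box = instance["BoundingBox"]  # Left/Top/Width/Height as fractions of the image
            print(f'{label["Name"]} {instance["Confidence"]:.1f}%', box)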

Text analysis

Amazon

12
H
MJIR
MJIR YT37A2
MAGOM
MAGOX
YT37A2
YT37A°C
MJIA YT37A°C
MJIA
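
OCR fragments like the strings above can be produced by Amazon Rekognition DetectText, which returns LINE and WORD detections with per-item confidence. A minimal sketch; the file name is a placeholder.

    # Hedged sketch: detect text in an image with Amazon Rekognition.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("untitled_people_at_table.jpg", "rb") as f:  # hypothetical local file
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # Each detection is a LINE or WORD with the recognized string and a confidence.
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"], f'{detection["Confidence"]:.1f}%')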

Google

H 12 MJIa ÝT33 A2 MAGON MJI Y T 33 A2 AAGOX
H
12
MJIa
ÝT33
A2
MAGON
MJI
Y
T
33
AAGOX
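
Google Cloud Vision text detection returns a first annotation holding the full detected string, followed by one annotation per token, which matches the layout of the list above. A sketch; the file name is a placeholder.

    # Hedged sketch: detect text in an image with Google Cloud Vision.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("untitled_people_at_table.jpg", "rb") as f:  # hypothetical local file
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)

    # The first annotation is the full text; the rest are individual tokens.
    for annotation in response.text_annotations:
        print(annotation.description)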