Human Generated Data

Title

Untitled (women at table at party)

Date

1964

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19199

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Chair 99.9
Furniture 99.9
Chair 99.9
Person 99.1
Human 99.1
Sitting 98
Restaurant 97.7
Person 97
Person 96.7
Chair 96.5
Person 94.4
Cafeteria 83.8
Cafe 82.7
Female 78.6
Chair 77.3
Person 74.6
Blonde 67.7
Teen 67.7
Kid 67.7
Child 67.7
Girl 67.7
Woman 67.7
Dating 61.5
Meal 58.3
Food 58.3
Flooring 55.3
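
The label/confidence pairs above are flat "Label score" lines as returned by image-tagging services such as Amazon Rekognition. A minimal sketch of how such a list might be parsed and filtered (the raw_tags excerpt and the 90.0 threshold are illustrative choices, not part of the record):

```python
# Parse "Label score" lines like the Amazon tag list above into
# (label, confidence) pairs, keeping only high-confidence labels and
# dropping duplicates (services often report one label per detected instance).
raw_tags = """\
Chair 99.9
Furniture 99.9
Chair 99.9
Person 99.1
Sitting 98
Cafeteria 83.8
Flooring 55.3"""

def parse_tags(text, threshold=90.0):
    seen = set()
    kept = []
    for line in text.splitlines():
        # Split on the last space so multi-word labels survive intact.
        label, score = line.rsplit(" ", 1)
        if float(score) >= threshold and label not in seen:
            seen.add(label)
            kept.append((label, float(score)))
    return kept

print(parse_tags(raw_tags))
# [('Chair', 99.9), ('Furniture', 99.9), ('Person', 99.1), ('Sitting', 98.0)]
```

Note that "Chair" appears four times in the full list above with different scores; deduplication like this keeps only the highest-ranked (first) occurrence.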

Clarifai
created on 2023-10-22

people 99.7
furniture 98.4
adult 97.9
woman 97.2
sit 96.9
man 96.3
room 96.1
seat 94.7
chair 94.3
table 94.2
group 94.2
monochrome 94
indoors 92.3
group together 91.9
two 91.7
three 85.6
dining room 85.6
four 85.5
recreation 85.2
home 83.5

Imagga
created on 2022-03-05

man 29.6
chair 29.5
people 26.8
office 25.1
indoors 21.1
adult 21
room 20.5
business 20
male 19.8
person 19.1
women 19
computer 18.6
work 17.8
sitting 17.2
men 17.2
laptop 16.7
lifestyle 15.2
happy 15
interior 15
businessman 15
table 14.9
seat 14.7
building 14.5
smiling 13.7
home 13.6
meeting 13.2
group 12.9
barbershop 12.8
modern 12.6
communication 12.6
couple 12.2
two 11.9
newspaper 11.7
furniture 11.7
team 11.6
job 11.5
working 11.5
corporate 11.2
classroom 10.5
together 10.5
shop 10.5
professional 10.4
friends 10.3
senior 10.3
luxury 10.3
smile 10
chairs 9.8
product 9.7
desk 9.7
success 9.6
teamwork 9.3
travel 9.1
businesswoman 9.1
transportation 9
outdoors 9
cheerful 8.9
education 8.7
window 8.5
casual 8.5
equipment 8.4
spectator 8.4
relax 8.4
attractive 8.4
company 8.4
mercantile establishment 8.3
back 8.3
conference 7.8
architecture 7.8
executive 7.8
concentration 7.7
outside 7.7
teacher 7.6
creation 7.6
structure 7.5
mature 7.4
worker 7.4
phone 7.4
lady 7.3
student 7.2
barber chair 7.2
passenger 7.2
love 7.1
happiness 7

Google
created on 2022-03-05

Furniture 94.9
Chair 91.4
Black 89.7
Black-and-white 85.7
Window 84.6
Style 84
Monochrome 77.6
Monochrome photography 76.2
Room 70.2
Event 69.2
Sitting 65.4
Stock photography 63.9
Conversation 61.5
Street 60.6
Font 59.8
Table 57
City 52.9

Microsoft
created on 2022-03-05

furniture 97.8
text 97.3
chair 97.2
clothing 93
person 91.9
black and white 89.6
woman 84.6
man 61.4
table 27.3
dining table 14.4

Face analysis

AWS Rekognition

Age 27-37
Gender Male, 71.4%
Calm 81.2%
Confused 6.6%
Surprised 5.4%
Fear 4.4%
Sad 0.8%
Angry 0.6%
Happy 0.5%
Disgusted 0.4%

AWS Rekognition

Age 51-59
Gender Female, 98%
Calm 86.3%
Sad 12.3%
Happy 0.6%
Fear 0.2%
Disgusted 0.2%
Surprised 0.2%
Confused 0.2%
Angry 0.1%

AWS Rekognition

Age 29-39
Gender Female, 51.3%
Calm 99.6%
Confused 0.1%
Surprised 0.1%
Happy 0.1%
Fear 0%
Angry 0%
Sad 0%
Disgusted 0%
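
Each AWS Rekognition block above reports an emotion distribution summing to roughly 100%. A sketch of how the dominant emotion might be extracted (the face dict is a hand-built stand-in mirroring the first block above, not an actual API response object):

```python
# A hand-built summary of the first face block above: age range, a gendered
# estimate with its confidence, and an emotion -> percentage distribution.
face = {
    "AgeRange": (27, 37),
    "Gender": ("Male", 71.4),
    "Emotions": {"CALM": 81.2, "CONFUSED": 6.6, "SURPRISED": 5.4,
                 "FEAR": 4.4, "SAD": 0.8, "ANGRY": 0.6,
                 "HAPPY": 0.5, "DISGUSTED": 0.4},
}

def dominant_emotion(face):
    # Pick the emotion with the highest reported confidence.
    return max(face["Emotions"], key=face["Emotions"].get)

print(dominant_emotion(face))  # CALM
```

Thresholding is worth considering for the gender scores too: the third face's "Female, 51.3%" is barely above chance, in contrast to the second face's 98%.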

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Chair 99.9%
Chair 99.9%
Chair 96.5%
Chair 77.3%
Person 99.1%
Person 97%
Person 96.7%
Person 94.4%
Person 74.6%

Text analysis

Amazon

6
F
MJI7
MAGOM
YT37A2
MAQOX
MJIR YT37A2
MJIR

Google

MJII Y T37 A2 CAGOX MJL3 Y T33 A2 MAGOM
MJII
Y
T37
A2
CAGOX
MJL3
T33
MAGOM