Human Generated Data

Title

Untitled (young men getting dressed for ball)

Date

c. 1966

People

Artist: Robert Burian, American active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19244

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Person 99.2
Human 99.2
Person 98.8
Apparel 93.8
Clothing 93.8
Home Decor 80.2
Door 69.7
Chair 64.8
Furniture 64.8
Shirt 60.7
Shorts 59.3
Indoors 58.5
Room 58.5
Pants 58.1
Floor 57.2

Imagga
created on 2022-02-25

person 37.3
man 34.3
male 30.7
adult 30.7
people 27.9
professional 25.5
business 23.1
businessman 21.2
men 20.6
portrait 19.4
office 17.9
standing 17.4
looking 16.8
smiling 15.9
indoors 15.8
smile 15.7
happy 15.7
corporate 15.5
lifestyle 14.5
fashion 14.3
women 14.2
pretty 14
employee 13.9
black 13.8
worker 13.7
casual 13.6
attractive 13.3
job 13.3
lady 13
room 12.2
executive 12.2
suit 12.2
modern 11.9
teacher 11.9
indoor 11.9
businesswoman 11.8
happiness 11.8
model 11.7
building 11.4
life 11.3
two 11
nurse 10.8
waiter 10.8
cheerful 10.6
couple 10.5
one 10.5
door 10.1
alone 10
dress 9.9
home 9.6
businesspeople 9.5
work 9.4
window 9.4
inside 9.2
student 9.2
handsome 8.9
style 8.9
family 8.9
interior 8.8
shop 8.7
dining-room attendant 8.6
confidence 8.6
elegant 8.6
career 8.5
communication 8.4
holding 8.3
human 8.2
guy 8.2
confident 8.2
clothing 8
sexy 8
device 8
working 8
hands 7.8
1 7.7
wall 7.7
formal 7.6
desk 7.6
elegance 7.6
house 7.5
jacket 7.5
light 7.4
occupation 7.3
color 7.2
team 7.2
bright 7.1
coat 7
together 7

Google
created on 2022-02-25

Microsoft
created on 2022-02-25

text 99.8
person 99.1
wall 99
clothing 98
man 96
standing 78.6
posing 59.1

Face analysis

AWS Rekognition

Age 18-26
Gender Male, 100%
Surprised 56.5%
Calm 28.9%
Confused 10.3%
Sad 2%
Angry 1%
Disgusted 0.7%
Fear 0.5%
Happy 0.1%

AWS Rekognition

Age 24-34
Gender Male, 99.9%
Calm 96.7%
Confused 3%
Disgusted 0.1%
Sad 0.1%
Surprised 0.1%
Angry 0.1%
Fear 0%
Happy 0%

Microsoft Cognitive Services

Age 34
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%

Captions

Microsoft

a man standing in front of a mirror posing for the camera 82.4%
a group of people standing in front of a mirror posing for the camera 82.3%
a man and woman standing in front of a mirror posing for the camera 67.1%

Text analysis

Amazon

173
133

Google

173 13
13
173