Human Generated Data

Title

Untitled (three men at party)

Date

1965

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19332

Machine Generated Data

Tags (confidence %)

Amazon
created on 2022-02-25

Person 96.6
Human 96.6
Apparel 93.6
Clothing 93.6
Tie 93.1
Accessories 93.1
Accessory 93.1
Person 89.3
Overcoat 88.4
Suit 88.4
Coat 88.4
Person 86.8
Door 80.3
Face 76.7
Suit 74.6
Tuxedo 74.4
Elevator 65.6
Suit 64.4
Performer 60.1
Photo Booth 59
Display 55.4
Monitor 55.4
Electronics 55.4
Screen 55.4
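
The labels above are the sort of output returned by Amazon Rekognition's object and scene detection. A minimal sketch of the call that could produce such a list, assuming boto3 with configured AWS credentials; the filename, region, and confidence threshold are illustrative:

```python
import boto3

# Sketch only: filename, region, and threshold are illustrative.
client = boto3.client("rekognition", region_name="us-east-1")
with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # lowest score shown above is 55.4
    )
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```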

Imagga
created on 2022-02-25

business 41.3
office 36.4
adult 35.3
man 32.3
corporate 31.8
professional 31.1
people 30.7
groom 30.4
person 30.1
businessman 28.3
passenger 27.1
businesswoman 25.5
male 24.8
work 22.8
happy 22.6
executive 22.3
smile 21.4
attractive 21
job 20.4
building 20.1
businesspeople 17.1
looking 16.8
suit 16.8
smiling 16.6
career 16.1
meeting 16
worker 16
communication 16
company 15.8
20s 15.6
working 15
portrait 14.9
pretty 14.7
sitting 14.6
laptop 14.6
employee 14.3
happiness 14.1
phone 13.8
successful 13.7
confident 13.6
computer 13.6
team 13.4
men 12.9
success 12.9
student 12.8
women 12.7
modern 12.6
handsome 12.5
black 12
lifestyle 11.6
indoors 11.4
talking 11.4
fashion 11.3
manager 11.2
holding 10.7
lady 10.6
confidence 10.5
formal 10.5
group 10.5
notebook 10.3
car 10.3
mature 10.2
cheerful 9.8
technology 9.6
boss 9.6
education 9.5
one person 9.4
cute 9.3
teamwork 9.3
occupation 9.2
alone 9.1
indoor 9.1
clothing 8.9
shop 8.7
couple 8.7
employment 8.7
workplace 8.6
smart 8.5
finance 8.5
one 8.2
family 8
day 7.8
conversation 7.8
colleagues 7.8
casual 7.6
ethnic 7.6
desk 7.6
inside 7.4
device 7.3
vehicle 7.3
telephone 7.1
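
Imagga's tag list maps to its v2 tagging endpoint, which returns English tags with 0-100 confidence scores. A minimal sketch using the requests library; the API credentials and image URL are placeholders:

```python
import requests

# Sketch only: credentials and image URL are placeholders.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},
    auth=("API_KEY", "API_SECRET"),
)
resp.raise_for_status()
for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```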

Google
created on 2022-02-25
(no tags returned)

Microsoft
created on 2022-02-25

person 94.7
clothing 93.7
man 90
suit 76
text 70
human face 69.9
smile 69.3
black and white 69.1
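
The Microsoft tags correspond to Azure Computer Vision image analysis with the tags feature requested. A sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and filename are placeholders, and the SDK reports confidence on a 0-1 scale:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from azure.cognitiveservices.vision.computervision.models import VisualFeatureTypes
from msrest.authentication import CognitiveServicesCredentials

# Sketch only: endpoint, key, and filename are placeholders.
client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("KEY"),
)
with open("photo.jpg", "rb") as f:
    analysis = client.analyze_image_in_stream(
        f, visual_features=[VisualFeatureTypes.tags]
    )
for tag in analysis.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")  # confidence is 0-1
```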

Face analysis

AWS Rekognition

Age 22-30
Gender Male, 98.2%
Happy 99.2%
Confused 0.4%
Surprised 0.2%
Angry 0.1%
Disgusted 0%
Fear 0%
Calm 0%
Sad 0%

AWS Rekognition

Age 52-60
Gender Male, 100%
Happy 99.4%
Surprised 0.2%
Calm 0.1%
Disgusted 0.1%
Fear 0.1%
Angry 0%
Confused 0%
Sad 0%

AWS Rekognition

Age 54-64
Gender Male, 99.8%
Calm 53%
Confused 44%
Sad 1.6%
Surprised 0.5%
Fear 0.3%
Angry 0.3%
Disgusted 0.2%
Happy 0.1%
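
Each AWS Rekognition block above describes one detected face; results like these come from the detect_faces operation with all facial attributes requested. A minimal boto3 sketch, with the filename and region illustrative:

```python
import boto3

# Sketch only: filename and region are illustrative.
client = boto3.client("rekognition", region_name="us-east-1")
with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```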

Microsoft Cognitive Services

Age 31
Gender Male
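
Microsoft's point estimates for age and gender correspond to the Azure Face API's detect operation with those attributes requested. A sketch using the azure-cognitiveservices-vision-face SDK; the endpoint, key, and filename are placeholders, and note that Microsoft has since restricted access to the age and gender attributes:

```python
from azure.cognitiveservices.vision.face import FaceClient
from azure.cognitiveservices.vision.face.models import FaceAttributeType
from msrest.authentication import CognitiveServicesCredentials

# Sketch only: endpoint, key, and filename are placeholders.
client = FaceClient(
    "https://<resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("KEY"),
)
with open("photo.jpg", "rb") as f:
    faces = client.face.detect_with_stream(
        f, return_face_attributes=[FaceAttributeType.age, FaceAttributeType.gender]
    )
for face in faces:
    print(f"Age {face.face_attributes.age:.0f}")
    print(f"Gender {face.face_attributes.gender}")  # e.g. 'male'
```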

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
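
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A minimal sketch using the google-cloud-vision client, assuming application default credentials; the filename is illustrative:

```python
from google.cloud import vision

# Sketch only: assumes application default credentials are configured.
client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())
response = client.face_detection(image=image)
for face in response.face_annotations:
    for name, value in (
        ("Surprise", face.surprise_likelihood),
        ("Anger", face.anger_likelihood),
        ("Sorrow", face.sorrow_likelihood),
        ("Joy", face.joy_likelihood),
        ("Headwear", face.headwear_likelihood),
        ("Blurred", face.blurred_likelihood),
    ):
        print(name, vision.Likelihood(value).name)  # e.g. VERY_LIKELY
```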

Feature analysis

Amazon

Person 96.6%
Tie 93.1%
Suit 88.4%

Captions

Microsoft

a person standing in front of a door 58.4%
a man and a woman standing in front of a door 35.8%
a person standing next to a door 35.7%
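
Ranked captions with confidences like these match Azure Computer Vision's describe operation, which returns several caption candidates per image. A sketch, again with a placeholder endpoint, key, and filename:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Sketch only: endpoint, key, and filename are placeholders.
client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("KEY"),
)
with open("photo.jpg", "rb") as f:
    description = client.describe_image_in_stream(f, max_candidates=3)
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```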

Text analysis

Amazon

65
JAN
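
Detected text such as the "65" and "JAN" of the photograph's date stamp is the kind of result returned by Rekognition's detect_text operation. A minimal boto3 sketch, with the filename and region illustrative:

```python
import boto3

# Sketch only: filename and region are illustrative.
client = boto3.client("rekognition", region_name="us-east-1")
with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})
for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        print(detection["DetectedText"])
```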