Human Generated Data

Title

Untitled (couple sitting on bench)

Date

1965

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19287

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Human 99
Person 99
Person 96.2
Clothing 92.3
Apparel 92.3
Furniture 87.1
Sitting 87.1
Text 83.6
Bench 75.6
Overcoat 65.4
Coat 65.4
Suit 65.4
Plant 65.4
People 62.3
Monitor 57.8
Display 57.8
LCD Screen 57.8
Electronics 57.8
Screen 57.8
Flower 56.5
Blossom 56.5
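The tag list above pairs each label with a confidence score from 0 to 100. As a minimal sketch (the data structure and threshold below are illustrative, mirroring a few of the Amazon entries above rather than any actual API response), such label/confidence pairs can be filtered to keep only higher-confidence tags:

```python
# Illustrative subset of the machine-generated tags above:
# each entry pairs a label with a confidence score (0-100).
tags = [
    ("Human", 99.0), ("Person", 99.0), ("Clothing", 92.3),
    ("Furniture", 87.1), ("Sitting", 87.1), ("Bench", 75.6),
    ("Overcoat", 65.4), ("Monitor", 57.8), ("Blossom", 56.5),
]

def confident_tags(tags, min_confidence=75.0):
    """Keep only labels at or above the confidence cutoff."""
    return [name for name, score in tags if score >= min_confidence]

print(confident_tags(tags))
```

With the default cutoff of 75, only the first six labels survive; lowering it admits weaker guesses such as "Monitor" and "Blossom", which is why low-confidence tags here (a photo of a couple on a bench tagged as an LCD screen) should be read skeptically.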

Imagga
created on 2022-02-25

person 43.4
business 43.1
office 38.6
professional 35.9
businessman 34.4
man 34.3
laptop 33.7
corporate 31.8
scholar 31.6
businesswoman 30.9
adult 29.8
people 29.6
male 29.1
work 27.5
computer 26.6
intellectual 26.2
happy 25.7
executive 24.6
job 23.9
smiling 21.7
working 21.2
success 20.9
smile 20
worker 19.9
communication 19.3
sitting 18.9
holding 18.2
suit 18.1
team 17.9
group 17.7
sax 17.6
meeting 17
musical instrument 16.9
wind instrument 16.8
manager 16.8
table 16.6
occupation 16.5
confident 16.4
men 16.3
desk 16.2
businesspeople 16.1
successful 15.6
women 15
notebook 15
handsome 14.3
life 14.2
colleagues 13.6
portrait 13.6
modern 13.3
attractive 13.3
technology 12.6
indoors 12.3
career 12.3
education 12.1
teacher 11.8
busy 11.6
boss 11.5
cheerful 11.4
teamwork 11.1
room 11.1
director 10.2
friendly 10.1
employee 10.1
face 9.9
corporation 9.7
looking 9.6
student 9.3
coffee 9.3
alone 9.1
black 9
couple 8.7
standing 8.7
paper 8.6
confidence 8.6
workplace 8.6
tie 8.5
brass 8.5
expression 8.5
doctor 8.5
finance 8.5
study 8.4
indoor 8.2
lady 8.1
accordion 8.1
medical 7.9
together 7.9
happiness 7.8
conference 7.8
discussion 7.8
pretty 7.7
staff 7.7
casual 7.6
talking 7.6
one person 7.5
human 7.5
positive 7.4
newspaper 7.3
home 7.2
building 7.1

Google
created on 2022-02-25

Microsoft
created on 2022-02-25

text 99.2
wall 96
person 86.6
clothing 83
man 50.9
picture frame 12.5

Face analysis

AWS Rekognition

Age 18-26
Gender Female, 98.5%
Happy 99.8%
Surprised 0.1%
Confused 0%
Calm 0%
Angry 0%
Sad 0%
Fear 0%
Disgusted 0%

AWS Rekognition

Age 16-24
Gender Female, 65.4%
Confused 35.6%
Happy 25.2%
Surprised 11.7%
Calm 9.3%
Disgusted 7.2%
Angry 7%
Sad 2.7%
Fear 1.3%

Microsoft Cognitive Services

Age 24
Gender Male

Microsoft Cognitive Services

Age 19
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely
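Each face record above reports a score per emotion, and the reported emotion is simply the highest-scoring entry. A minimal sketch (the dictionary below is a hypothetical restatement of the first AWS Rekognition record above, not an actual API payload):

```python
# Emotion scores (percent) from the first AWS Rekognition face record above.
emotions = {
    "Happy": 99.8, "Surprised": 0.1, "Confused": 0.0, "Calm": 0.0,
    "Angry": 0.0, "Sad": 0.0, "Fear": 0.0, "Disgusted": 0.0,
}

def dominant_emotion(scores):
    """Return the emotion with the highest confidence score."""
    return max(scores, key=scores.get)

print(dominant_emotion(emotions))
```

Note how the two records differ: the first face is an unambiguous "Happy" at 99.8%, while the second splits its mass across Confused, Happy, and Surprised, so its top label is far less reliable.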

Feature analysis

Amazon

Person 99%
Bench 75.6%

Captions

Microsoft

a person sitting in front of a sign 54.1%
a person sitting on top of a sign 50.8%
a person holding a sign 46.6%

Text analysis

Amazon

65
JAN
129
132

Google

JAN /32 129
/32
JAN
129