Human Generated Data

Title

Untitled (portrait of an older couple seated in wicker chairs)

Date

1930

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1886

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.1
Human 99.1
Furniture 98
Person 97.6
Apparel 96.7
Clothing 96.7
Chair 91.5
Couch 83.8
Face 75.7
Sitting 75.1
Female 66.2
People 63.7
Suit 58.1
Coat 58.1
Overcoat 58.1

Imagga
created on 2021-12-14

office 50.9
business 41.9
businessman 39.8
man 39.7
person 38.3
computer 38.1
people 37.4
laptop 36.1
male 34.1
working 32.7
professional 32.1
adult 31.5
businesswoman 30.9
work 30.8
businesspeople 30.4
room 30
corporate 28.4
desk 27.6
table 26.6
job 24.8
executive 23.6
indoors 22.9
meeting 22.6
happy 22.6
team 21.5
group 21
worker 21
sitting 19.8
colleagues 19.4
newspaper 19.1
manager 18.6
smiling 18.1
men 18.1
patient 17.8
teacher 17.6
teamwork 17.6
indoor 17.4
smile 17.1
technology 17.1
classroom 17
workplace 16.2
success 16.1
communication 16
successful 15.6
talking 15.2
women 15
occupation 13.8
20s 13.8
lifestyle 13.7
confident 13.7
looking 13.6
home 13.6
casual 13.6
product 13
education 13
portrait 13
associates 12.8
suit 12.6
together 12.3
formal 11.5
face 11.4
modern 11.2
expression 11.1
conference 10.8
reading 10.5
document 10.2
happiness 10.2
creation 10.1
horizontal 10.1
monitor 10
consultant 9.7
student 9.7
scholar 9.7
career 9.5
keyboard 9.4
company 9.3
notebook 9.3
hand 9.1
attractive 9.1
cheerful 8.9
handsome 8.9
discussing 8.8
coworkers 8.8
medical 8.8
discussion 8.8
paper 8.7
cooperation 8.7
busy 8.7
hospital 8.7
illness 8.6
college 8.5
doctor 8.5
learning 8.5
employee 8.5
finance 8.5
clinic 8.3
alone 8.2
chair 8.2
educator 8.2
bright 7.9
day 7.9
businessperson 7.8
partners 7.8
jacket 7.7
intellectual 7.7
30s 7.7
partnership 7.7
boss 7.7
shop 7.5
contemporary 7.5
house 7.5
screen 7.5
presentation 7.5
sick person 7.4
focus 7.4
case 7.3
furniture 7.3
color 7.2
nurse 7.1
interior 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 99.4
clothing 89.9
person 88.3
furniture 85.5
black and white 80
chair 68.6
wedding dress 62.7
man 60.7

Face analysis

AWS Rekognition

Age 51-69
Gender Male, 97.1%
Calm 80.2%
Happy 13.1%
Sad 2.4%
Surprised 2.3%
Confused 1%
Angry 0.7%
Fear 0.3%
Disgusted 0.1%

AWS Rekognition

Age 49-67
Gender Male, 95.9%
Happy 52.2%
Calm 25.8%
Sad 18.1%
Confused 2.7%
Angry 0.5%
Disgusted 0.2%
Fear 0.2%
Surprised 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%
Chair 91.5%

Captions

Microsoft

a person holding a book 60.2%
a person sitting on a book 43.3%
a person sitting on top of a book 43.2%