Human Generated Data

Title

Untitled (back patio barbeque)

Date

1959, printed later

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.194

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.7
Human 99.7
Person 99.7
Person 97.2
Person 96.7
Person 90.5
Clothing 63.7
Apparel 63.7
Person 62.3
Building 61
Furniture 59.4
Porch 59
Neighborhood 57.4
Urban 57.4
Canopy 57.4
Awning 57.4
Chair 56.3
Clinic 55.1

Imagga
created on 2022-01-08

office 58.6
room 44.2
center 41.5
classroom 39.7
computer 33.7
desk 31.9
laptop 30.3
people 30.1
business 29.1
work 28.3
man 27
working 26.5
table 25.6
interior 24.8
indoors 24.6
person 23.2
businessman 19.4
team 18.8
male 18.4
businesswoman 18.2
adult 18
group 17.7
sitting 17.2
meeting 16.9
home 16.7
teamwork 16.7
modern 16.1
job 15.9
executive 14.7
communication 14.3
businesspeople 14.2
house 14.2
happy 13.8
corporate 13.7
men 13.7
professional 13.7
chair 13.5
technology 13.4
worker 12.5
furniture 12
teacher 11.7
lifestyle 11.6
smile 11.4
success 11.3
study 11.2
women 11.1
indoor 11
smiling 10.8
newspaper 10.8
kitchen 10.7
student 10.6
studying 10.5
notebook 10.5
workplace 10.5
learning 10.3
monitor 10.3
floor 10.2
equipment 10.2
conference 9.8
education 9.5
building 9.4
design 9
school 8.8
couple 8.7
class 8.7
paper 8.6
reading 8.6
casual 8.5
contemporary 8.5
presentation 8.4
idea 8
together 7.9
attractive 7.7
exam 7.7
wireless 7.6
shop 7.6
keyboard 7.5
row 7.4
inside 7.4
light 7.3
portrait 7.1
to 7.1
architecture 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 98
house 92.5
table 90.3
outdoor 85.8
black and white 85.7
furniture 84
person 83.9
clothing 57.2

Face analysis

Amazon

Google

AWS Rekognition

Age 42-50
Gender Male, 99.9%
Calm 99%
Confused 0.3%
Surprised 0.2%
Sad 0.1%
Disgusted 0.1%
Fear 0.1%
Happy 0.1%
Angry 0.1%

AWS Rekognition

Age 48-54
Gender Male, 99.5%
Happy 76.7%
Angry 11.7%
Disgusted 5.4%
Sad 3.5%
Calm 1.7%
Fear 0.4%
Surprised 0.4%
Confused 0.2%

AWS Rekognition

Age 56-64
Gender Female, 57.4%
Calm 81.9%
Sad 9.3%
Fear 2.5%
Disgusted 2%
Confused 1.5%
Angry 1.2%
Happy 1.1%
Surprised 0.4%

AWS Rekognition

Age 28-38
Gender Female, 96.8%
Sad 96.3%
Disgusted 1.4%
Calm 0.8%
Fear 0.6%
Confused 0.3%
Happy 0.3%
Angry 0.2%
Surprised 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Feature analysis

Amazon

Person 99.7%
Chair 56.3%

Text analysis

Google

IRY
CKIK IRY
CKIK