Human Generated Data

Title

Untitled (man and woman performing at the Siesta Key Actors' Theater)

Date

1969

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11382

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-14

Person 99.3
Human 99.3
Person 98.9
Indoors 88
Interior Design 85.5
Room 79.8
Apparel 70.8
Clothing 70.8
Furniture 67.9
Female 64.2
Portrait 62
Photo 62
Photography 62
Face 62
Sink 58
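
The tags above come from Amazon Rekognition's label detection. As a minimal sketch, labels like these could be reproduced with the DetectLabels API via boto3; the image filename and the confidence threshold below are illustrative assumptions, not part of this record.

```python
# Sketch: generate image tags with Amazon Rekognition's DetectLabels.
# The file path and MinConfidence value are assumptions for illustration.
import boto3

client = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("steinmetz_4.2002.11382.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # roughly matches the lowest score shown above (Sink 58)
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')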

Imagga
created on 2022-01-14

counter 56.4
dishwasher 33.1
white goods 27.5
home 26.3
home appliance 24.6
people 24.5
man 23.5
interior 22.1
kitchen 21.8
house 20.9
happy 20.7
smiling 20.2
appliance 19.7
person 18.5
male 18.4
adult 18.3
shop 16.9
modern 16.1
room 16
couple 14.8
table 14.7
lifestyle 14.5
smile 14.2
cheerful 13.8
shopping 12.8
business 12.7
women 12.7
family 12.4
indoors 12.3
furniture 12.2
product 12
portrait 11.6
standing 11.3
newspaper 11.2
decoration 10.8
holding 10.7
work 10.6
new 10.5
happiness 10.2
casual 10.2
team 9.8
employee 9.8
waiter 9.6
sitting 9.4
worker 9.3
child 9.2
food 8.8
gift 8.6
architecture 8.6
luxury 8.6
chair 8.5
two 8.5
buy 8.4
bakery 8.4
mature 8.4
color 8.3
durables 8.3
coffee 8.3
sale 8.3
20s 8.2
indoor 8.2
supermarket 8.2
market 8
decor 8
businessman 7.9
together 7.9
holiday 7.9
cooking 7.9
cart 7.8
men 7.7
sink 7.7
attractive 7.7
30s 7.7
office 7.7
desk 7.7
mother 7.7
customer 7.6
store 7.6
camera 7.4
mercantile establishment 7.2
creation 7.2
medical 7.1
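
Imagga exposes its tagger as a REST API. A sketch of one plausible way to fetch tags like those above, assuming the v2 tags endpoint with HTTP basic authentication; the credentials and image URL are placeholders.

```python
# Sketch: request tags from the Imagga v2 API for a hosted image.
# API credentials and image URL are placeholders.
import requests

api_key = "YOUR_IMAGGA_API_KEY"        # placeholder
api_secret = "YOUR_IMAGGA_API_SECRET"  # placeholder
image_url = "https://example.org/steinmetz.jpg"  # hypothetical URL

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=(api_key, api_secret),
)
response.raise_for_status()

for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')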

Google
created on 2022-01-14

Microsoft
created on 2022-01-14

text 98
indoor 96.1
clothing 87.5
table 84.7
person 83.3
furniture 68.2
wedding 51.9
kitchen appliance 11.2
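
The Microsoft tags above are the kind of output returned by Azure Computer Vision's tagging endpoint. A minimal sketch against the v3.2 REST API; the endpoint, key, and image URL are placeholders.

```python
# Sketch: tag an image with the Azure Computer Vision "tag" endpoint (v3.2).
# Endpoint, key, and image URL are placeholders.
import requests

endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
key = "YOUR_AZURE_KEY"  # placeholder

response = requests.post(
    f"{endpoint}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": key},
    json={"url": "https://example.org/steinmetz.jpg"},  # hypothetical URL
)
response.raise_for_status()

for tag in response.json()["tags"]:
    # confidences are 0-1 in the API; scores above read as percentages
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')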

Face analysis

AWS Rekognition

Age 24-34
Gender Female, 93.5%
Calm 99.3%
Happy 0.2%
Disgusted 0.1%
Sad 0.1%
Surprised 0.1%
Confused 0.1%
Angry 0%
Fear 0%

AWS Rekognition

Age 26-36
Gender Female, 97%
Sad 95.2%
Calm 2.2%
Confused 0.9%
Disgusted 0.5%
Angry 0.4%
Fear 0.4%
Surprised 0.3%
Happy 0.2%

AWS Rekognition

Age 12-20
Gender Male, 99.7%
Calm 89.2%
Angry 6.6%
Surprised 2.4%
Happy 0.6%
Disgusted 0.4%
Sad 0.4%
Fear 0.2%
Confused 0.1%

AWS Rekognition

Age 16-24
Gender Male, 62.6%
Confused 62.1%
Calm 11%
Fear 6.7%
Surprised 5.8%
Sad 5.4%
Disgusted 3.6%
Happy 2.9%
Angry 2.5%
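
Each block above corresponds to one detected face. As a sketch, per-face age ranges, gender estimates, and emotion scores like these could be produced with Rekognition's DetectFaces API; the filename is a placeholder, and Attributes=["ALL"] is what requests the age, gender, and emotion fields.

```python
# Sketch: per-face age range, gender, and emotion scores with Rekognition DetectFaces.
import boto3

client = boto3.client("rekognition")

with open("steinmetz_4.2002.11382.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # emotions arrive unsorted; sort descending to match the listing above
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')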

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
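
Google Vision reports face attributes as likelihood buckets ("Very unlikely" through "Very likely") rather than percentages. A minimal sketch using the google-cloud-vision client; credentials and the file path are assumed.

```python
# Sketch: likelihood ratings (joy, sorrow, anger, surprise, headwear, blur)
# via Google Cloud Vision face detection. Assumes application default credentials.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_4.2002.11382.jpg", "rb") as f:  # hypothetical filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)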

Feature analysis

Amazon

Person 99.3%

Captions

Microsoft

a man and a woman standing in a kitchen 78.8%
a man and a woman standing in front of a counter 73.5%
a man and a woman standing in front of a window 51.3%
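
Ranked caption candidates like the three above are what Azure Computer Vision's describe endpoint returns. A sketch against the v3.2 REST API, with maxCandidates=3 to match the listing; endpoint, key, and image URL are placeholders.

```python
# Sketch: ranked caption candidates from the Azure Computer Vision "describe" endpoint.
import requests

endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
key = "YOUR_AZURE_KEY"  # placeholder

response = requests.post(
    f"{endpoint}/vision/v3.2/describe",
    params={"maxCandidates": 3},  # three candidates, as shown above
    headers={"Ocp-Apim-Subscription-Key": key},
    json={"url": "https://example.org/steinmetz.jpg"},  # hypothetical URL
)
response.raise_for_status()

for caption in response.json()["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')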

Text analysis

Amazon

57858.
RAGOX
MJIR--YT3RA°2

Google

57858.
57858.
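
A sketch covering both OCR providers shown above. Rekognition's DetectText returns LINE and WORD detections; Google Vision's text_detection returns the full detected text as the first annotation followed by the individual words, which is why "57858." can appear twice in its output. The filename is a placeholder and credentials for both services are assumed to be configured.

```python
# Sketch: OCR with Amazon Rekognition DetectText and Google Cloud Vision text detection.
import boto3
from google.cloud import vision

with open("steinmetz_4.2002.11382.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

# Amazon Rekognition: keep LINE detections, skip the per-word duplicates
rekognition = boto3.client("rekognition")
for detection in rekognition.detect_text(Image={"Bytes": image_bytes})["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])

# Google Cloud Vision: first annotation is the full text, the rest are words
client = vision.ImageAnnotatorClient()
response = client.text_detection(image=vision.Image(content=image_bytes))
for annotation in response.text_annotations:
    print(annotation.description)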