Human Generated Data

Title

Untitled (woman playing piano and man seated on couch in trailer, Sarasota, Florida)

Date

1954

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11712

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.6
Human 99.6
Person 99.5
Interior Design 98.5
Indoors 98.5
Furniture 97.6
Chair 97.6
Studio 88.2
Table 85.6
Electronics 83.5
Monitor 83.4
Display 83.4
Screen 83.4
Lighting 82.5
Tabletop 77.8
Desk 76.9
Computer 75.6
LCD Screen 71.2
Meal 65.1
Food 65.1
Pc 59.6
Room 56.4
Keyboard 55.7

Imagga
created on 2022-01-15

equipment 22.8
man 20.1
business 16.4
people 15.6
musical instrument 15.6
device 15.4
ball 14.4
person 14.2
office 14.1
modern 14
interior 13.3
computer 12.9
indoor 12.8
technology 12.6
transportation 12.5
men 12
male 11.3
adult 11.2
window 11
working 10.6
car 10.2
percussion instrument 10
travel 9.8
sport 9.8
work 9.5
shop 9.5
weight 9.4
barbershop 9.3
glass 9.3
sports equipment 9.3
leisure 9.1
exercise 9.1
vehicle 9.1
black 9
table 8.8
indoors 8.8
smiling 8.7
chair 8.6
industry 8.5
room 8.5
3d 8.5
dumbbell 8.4
radio 8.3
inside 8.3
holding 8.2
kitchen 8
lifestyle 7.9
women 7.9
machine 7.7
concentration 7.7
game equipment 7.6
soccer ball 7.6
communication 7.5
one 7.5
laptop 7.4
building 7.3
cheerful 7.3
hat 7.3
driver 7.3
fitness 7.2
team 7.2
businessman 7.1
happiness 7

Google
created on 2022-01-15

Photograph 94.2
Black 89.6
Black-and-white 84.5
Style 84
Window 82.3
Chair 79
Snapshot 74.3
Monochrome photography 71.8
Monochrome 70.5
Machine 67.5
Sitting 64.3
Stock photography 64
Stool 63.4
Rectangle 62.7
Font 61.7
Room 58.6
Suit 58.1
Grille 55.8
Desk 55.6
Table 54.8

Microsoft
created on 2022-01-15

text 95.3
black and white 93.7
window 92.2
furniture 82.2
chair 78.3
monochrome 55.5
table 55.1
computer 38.2

Face analysis

AWS Rekognition

Age 52-60
Gender Female, 79.5%
Calm 73.2%
Happy 25.2%
Sad 0.8%
Confused 0.3%
Surprised 0.2%
Disgusted 0.1%
Angry 0.1%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%

Captions

Microsoft

a man standing in front of a window 39.7%
a man that is standing in front of a window 37.2%
a man standing next to a window 31.5%

Text analysis

Amazon

39856
حعمء
حعمء VAGON
VAGON

Google

398
56
398 56