Human Generated Data

Title

Untitled (two costumed women sitting at a table)

Date

1947

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5507

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 97.3
Human 97.3
Person 96.9
Person 96.3
Person 94
Clinic 87.9
Art 86.4
Room 75.3
Indoors 75.3
Painting 68.6
Hospital 55.2
Operating Theatre 55.2
Person 51

Imagga
created on 2022-01-23

television 48.3
monitor 31.2
broadcasting 20.8
electronic equipment 19
container 18.8
telecommunication system 18.4
shredder 16.6
telecommunication 16.5
equipment 16.1
old 14.6
business 14.6
money 14.5
device 14.3
cash 13.7
bin 12.9
currency 12.6
bank 12.5
ashcan 12.5
digital 12.1
art 12
close 12
blackboard 11.9
finance 11.8
black 11.4
ice 10.9
financial 10.7
bill 10.5
detail 10.5
paper 10.4
medium 10.3
glass 10.3
savings 10.2
banking 10.1
symbol 10.1
water 10
bills 9.7
building 9.6
design 9.6
cold 9.5
architecture 9.4
dollar 9.3
science 8.9
pattern 8.9
liquid 8.7
global 8.2
technology 8.2
computer 8.1
wealth 8.1
clear 7.8
us 7.7
exchange 7.6
hand 7.6
house 7.5
splash 7.5
one 7.5
rich 7.4
light 7.3
backgrounds 7.3
history 7.2
wet 7.2
cool 7.1
world 7.1
modern 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99.7
drawing 99.3
sketch 97.8
painting 96.8
book 94.8
table 77.8
child art 70.2
old 67.7
furniture 62.1
cartoon 50.5
clothes 38.9
several 10.8

Face analysis

Amazon

AWS Rekognition

Age 24-34
Gender Female, 52.8%
Happy 66.8%
Calm 30.2%
Surprised 0.7%
Sad 0.6%
Confused 0.6%
Angry 0.4%
Disgusted 0.4%
Fear 0.2%

AWS Rekognition

Age 45-53
Gender Male, 70.7%
Sad 33.7%
Calm 31.9%
Happy 17.8%
Disgusted 5%
Confused 3.8%
Surprised 3.5%
Angry 2.3%
Fear 2%

AWS Rekognition

Age 27-37
Gender Male, 99.7%
Calm 51.3%
Sad 36.4%
Confused 3.4%
Happy 3%
Angry 2.4%
Surprised 2.3%
Disgusted 0.7%
Fear 0.5%

Feature analysis

Amazon

Person 97.3%
Painting 68.6%

Captions

Microsoft

a vintage photo of a person 60.5%
a vintage photo of a group of people in a room 60.4%
a vintage photo of some people 60.3%

Text analysis

Amazon

22649
VT3743
PLACE VT3743 02240
02240
PLACE

Google

22649
22649