Human Generated Data

Title

Untitled (three women in a prop closet, Hedgerow Theater, PA)

Date

c. 1938

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12018

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99
Human 99
Person 95.6
Person 87.4
Person 87
Person 78.6
Clothing 72.1
Apparel 72.1
Person 66.1
Advertisement 61.3
Workshop 59.1
Poster 58.1
Shop 56.3
Living Room 55.7
Indoors 55.7
Room 55.7
Shelf 55.6
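
The Amazon labels above are the kind of output returned by the Rekognition DetectLabels operation. A minimal sketch, assuming boto3 credentials are already configured and that a local scan of the photograph exists; the file name is hypothetical and not part of the record:

import boto3

# Hypothetical local scan of the photograph.
IMAGE_PATH = "steinmetz_prop_closet.jpg"

rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=25,       # cap on returned labels
        MinConfidence=55,   # drop labels scored below 55%
    )

# Print label/confidence pairs in the same "Name score" form used above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")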

Imagga
created on 2022-01-15

shop 54.5
shoe shop 43.3
mercantile establishment 36.1
refrigerator 31.5
white goods 27.1
place of business 24
newspaper 21.1
home appliance 20.4
interior 20.3
product 17.5
design 16.9
stall 16.5
sketch 15.5
business 15.2
window 15
modern 14
appliance 13.6
barbershop 12.8
drawing 12.2
creation 12.1
home 12
establishment 11.9
people 11.7
retro 11.5
buy 11.3
architecture 10.9
man 10.7
wall 10.3
city 10
fashion 9.8
building 9.8
old 9.7
technology 9.6
black 9.6
center 9.5
daily 9.5
store 9.4
comic book 9.3
house 9.2
equipment 9.1
art 9.1
texture 9
urban 8.7
retail 8.5
floor 8.4
sale 8.3
decoration 8.3
case 8.3
style 8.2
working 7.9
indoors 7.9
colorful 7.9
rack 7.8
paper 7.8
mall 7.8
person 7.7
construction 7.7
flower 7.7
wallpaper 7.7
casual 7.6
cable 7.6
hand 7.6
pattern 7.5
display 7.4
sport 7.4
inside 7.4
shopping 7.3
furniture 7.3
graphic 7.3
digital 7.3
room 7.3
computer 7.2
lines 7.2
table 7.1
women 7.1
server 7
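
Tags in this form can be requested from Imagga's v2 tagging endpoint over HTTP basic authentication. A minimal sketch, assuming an Imagga API key/secret and a publicly reachable image URL; all three values below are placeholders:

import requests

API_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_API_SECRET"  # placeholder
IMAGE_URL = "https://example.org/steinmetz_prop_closet.jpg"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # basic auth with key/secret
    timeout=30,
)
resp.raise_for_status()

# Each entry carries an English tag name and a confidence score.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")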

Microsoft
created on 2022-01-15

indoor 98.5
kitchen 96.8
text 96.5
person 92
clothing 91.9
drawing 90.6
black and white 88.8
sketch 75.6
open 46.8
messy 32.6
cluttered 31
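
Microsoft's tags correspond to the Azure Computer Vision tagging operation. A minimal sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
KEY = "YOUR_AZURE_KEY"                                             # placeholder
IMAGE_URL = "https://example.org/steinmetz_prop_closet.jpg"        # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# Tag the image and print name/confidence pairs like the list above.
result = client.tag_image(IMAGE_URL)
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")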

Face analysis

AWS Rekognition

Age 24-34
Gender Female, 98.6%
Calm 81.5%
Sad 18.3%
Disgusted 0%
Surprised 0%
Angry 0%
Happy 0%
Fear 0%
Confused 0%
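
Age range, gender, and emotion scores like these come from Rekognition's DetectFaces operation when all facial attributes are requested. A minimal sketch, again assuming a hypothetical local scan of the photograph:

import boto3

IMAGE_PATH = "steinmetz_prop_closet.jpg"  # hypothetical local scan

rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")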

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely
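
The likelihood buckets (Very unlikely through Very likely) match the face annotation fields returned by the Google Cloud Vision API. A minimal sketch using google-cloud-vision, assuming application default credentials and the same hypothetical local file:

from google.cloud import vision

IMAGE_PATH = "steinmetz_prop_closet.jpg"  # hypothetical local scan

client = vision.ImageAnnotatorClient()
with open(IMAGE_PATH, "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face annotation reports likelihoods as enum values such as VERY_UNLIKELY.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)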

Feature analysis

Amazon

Person 99%

Captions

Microsoft

a group of people standing in a kitchen 74.1%
a group of people standing in a kitchen preparing food 65.9%
a group of people in a kitchen preparing food 65.8%
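
Caption candidates with confidence scores like these are produced by the Azure Computer Vision describe operation. A minimal sketch with placeholder endpoint, key, and image URL:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
KEY = "YOUR_AZURE_KEY"                                             # placeholder
IMAGE_URL = "https://example.org/steinmetz_prop_closet.jpg"        # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# Ask for up to three caption candidates, as in the list above.
description = client.describe_image(IMAGE_URL, max_candidates=3)
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")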

Text analysis

Amazon

8
YT37A2
830И3330
32AB YT37A2 830И3330
32AB
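
The strings above (including the mirror-reversed characters) are raw output of the kind returned by Rekognition's DetectText operation. A minimal sketch, assuming the same hypothetical local scan:

import boto3

IMAGE_PATH = "steinmetz_prop_closet.jpg"  # hypothetical local scan

rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Detections come back as LINE and WORD entries with confidence scores.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"], round(detection["Confidence"], 1))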

Google

DELENDEB 2VLEIA BV2E
DELENDEB
2VLEIA
BV2E
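
Google's text results come from the Cloud Vision text detection feature; the first annotation is the full detected string and the remaining entries are individual tokens. A minimal sketch reusing the client setup from the face-detection example above:

from google.cloud import vision

IMAGE_PATH = "steinmetz_prop_closet.jpg"  # hypothetical local scan

client = vision.ImageAnnotatorClient()
with open(IMAGE_PATH, "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# text_annotations[0] holds the full string; later entries are individual tokens.
for annotation in response.text_annotations:
    print(annotation.description)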