Human Generated Data

Title

Untitled (man and woman sitting inside trailer home)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8867

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (label name, confidence %)

Amazon
created on 2022-01-15

Person 99.5
Human 99.5
Person 99.5
Clothing 91
Apparel 91
Sitting 90.6
Person 89.4
Shelf 87.6
Shop 86.8
Chair 84.2
Furniture 84.2
Shoe Shop 70
Workshop 56.4
Undershirt 56.4
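Tag lists like the one above are (label, confidence-in-percent) pairs returned by a label-detection service. A minimal sketch of filtering such output by confidence, using values copied from this record (the Rekognition call shown in the comment assumes the boto3 SDK and is not part of this record):

```python
# Each tag is a (label, confidence %) pair. With AWS Rekognition the raw
# call would be roughly (assumes boto3 and AWS credentials; not run here):
#   boto3.client("rekognition").detect_labels(Image={"Bytes": image_bytes})
# Below we only filter a list of pairs, using values copied from this record.
labels = [
    ("Person", 99.5), ("Clothing", 91.0), ("Sitting", 90.6),
    ("Shelf", 87.6), ("Shop", 86.8), ("Chair", 84.2),
    ("Shoe Shop", 70.0), ("Workshop", 56.4),
]

def confident(pairs, min_confidence=80.0):
    """Keep only label names at or above the confidence threshold."""
    return [name for name, conf in pairs if conf >= min_confidence]

print(confident(labels))
```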

Imagga
created on 2022-01-15

case 52.8
interior 48.6
shop 46.8
salon 45
table 30.6
furniture 28.8
modern 28
mercantile establishment 27.8
chair 27.7
indoors 27.2
room 26.1
design 24.2
window 23.5
house 23.4
home 23.1
inside 23
kitchen 20.8
decor 20.3
shoe shop 18.5
place of business 18.5
wood 18.3
architecture 17.9
lamp 17.1
business 17
floor 16.7
counter 16.6
barbershop 16.4
glass 16.3
indoor 15.5
light 15.4
office 14.9
style 14.8
luxury 14.6
people 13.9
restaurant 13.8
decoration 13.7
buy 13.1
urban 13.1
apartment 12.4
comfortable 11.4
dining 11.4
building 11.4
3d 10.8
chairs 10.8
empty 10.3
cabinet 10.3
food 10.3
work 10.2
equipment 10.1
man 10.1
computer 9.6
expensive 9.6
mirror 9.5
tile 9.5
wall 9.4
contemporary 9.4
lifestyle 9.4
establishment 9.1
oven 9.1
center 9
seat 8.8
scene 8.6
elegant 8.6
nobody 8.5
store 8.5
horizontal 8.4
sale 8.3
bar 8.3
city 8.3
fashion 8.3
shopping 8.3
plant 8.2
steel 7.9
stove 7.9
black 7.8
mall 7.8
render 7.8
residential 7.7
retail 7.6
relax 7.6
living 7.6
cook 7.3
new 7.3
stylish 7.2
shelf 7.2
working 7.1
life 7
structure 7

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 99.4
black and white 90.7
furniture 87.2
person 73.4
store 60.7
table 58.7
shop 12.8

Face analysis

AWS Rekognition

Age 16-24
Gender Female, 81%
Calm 96.7%
Sad 1.2%
Confused 0.9%
Happy 0.3%
Angry 0.3%
Surprised 0.2%
Disgusted 0.2%
Fear 0.1%
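Rekognition's face analysis reports one confidence score per emotion; the dominant emotion is simply the maximum over those scores. A sketch using the values copied from the record above:

```python
# Emotion confidences copied from the AWS Rekognition block above.
emotions = {
    "Calm": 96.7, "Sad": 1.2, "Confused": 0.9, "Happy": 0.3,
    "Angry": 0.3, "Surprised": 0.2, "Disgusted": 0.2, "Fear": 0.1,
}

# The dominant emotion is the key with the highest confidence.
dominant = max(emotions, key=emotions.get)
print(dominant)
```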

Google Vision (identical ratings reported for each of the three detected faces)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Chair 84.2%

Captions

Microsoft

a person standing in front of a store window 64.3%
a group of people standing in front of a store window 57.9%
a group of people in front of a store window 57%
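The captioning service returns several candidate captions, each with a confidence score; the usual choice is the highest-scoring candidate. A sketch using the captions copied from the record above:

```python
# Candidate captions with confidence scores, copied from the record above.
captions = [
    ("a person standing in front of a store window", 64.3),
    ("a group of people standing in front of a store window", 57.9),
    ("a group of people in front of a store window", 57.0),
]

# Pick the candidate with the highest confidence.
best_text, best_score = max(captions, key=lambda c: c[1])
print(best_text)
```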

Text analysis

Amazon

39858
g
123092
VISION

Google

3985 8
3985
8
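The two OCR services segment the same digits differently ("39858" versus "3985 8"); stripping whitespace before comparing makes the outputs comparable. A sketch using the tokens copied from the record above:

```python
# OCR tokens copied from the text-analysis record above.
amazon_tokens = ["39858", "g", "123092", "VISION"]
google_tokens = ["3985 8", "3985", "8"]

def normalized(tokens):
    """Whitespace-insensitive set of tokens."""
    return {t.replace(" ", "") for t in tokens}

# Tokens both services agree on, once whitespace differences are removed.
shared = normalized(amazon_tokens) & normalized(google_tokens)
print(shared)
```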