Human Generated Data

Title

Dresser Drawer, near Norfolk, Nebraska

Date

1947, printed later

People

Artist: Wright Morris, American, 1910–1998

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Richard and Ronay Menschel Fund for the Acquisition of Photographs, 2010.540.6

Copyright

© Estate of Wright Morris, Courtesy of the Center for Creative Photography

Machine Generated Data

Tags (label followed by confidence score, %)

Amazon
created on 2019-04-06

Tower 83.2
Architecture 83.2
Clock Tower 83.2
Building 83.2
Apparel 82
Clothing 82
Accessories 79.8
Accessory 79.8
Bag 77.1
Handbag 77.1
Purse 69.4
Furniture 67.5
Indoors 63.5
Room 63.5
Goggles 59.2
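
The Amazon labels above are the kind of output returned by AWS Rekognition's label detection. The following is a minimal sketch of how such tags could be reproduced with boto3; the bucket, key, and region are placeholders, not values from this record.

# Sketch: reproduce Rekognition-style labels for an image stored in S3.
# Bucket name, object key, and region below are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "morris_dresser_drawer.jpg"}},
    MaxLabels=20,
    MinConfidence=50,
)

for label in response["Labels"]:
    # Each label carries a name and a confidence score in percent,
    # matching the "Tower 83.2" style entries listed above.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')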

Clarifai
created on 2018-03-23

people 97.4
vehicle 93.8
monochrome 91.9
no person 91.1
military 90.8
wear 90.6
group 90.6
adult 90.2
room 89.9
one 89.4
indoors 88.8
war 87.4
furniture 86
retro 85
old 84.3
two 84.1
man 80.9
industry 79.5
art 78.7
many 77.5
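
The Clarifai concepts above (people, vehicle, monochrome, and so on) resemble output from Clarifai's general prediction model. A hedged sketch using the legacy clarifai 2.x Python client follows; the API key and image URL are placeholders.

# Sketch only: assumes the legacy clarifai 2.x Python client and its
# public general model; the API key and URL below are placeholders.
from clarifai.rest import ClarifaiApp

app = ClarifaiApp(api_key="YOUR_API_KEY")
model = app.public_models.general_model

response = model.predict_by_url(url="https://example.org/morris_dresser_drawer.jpg")

for concept in response["outputs"][0]["data"]["concepts"]:
    # Concept values are probabilities in [0, 1]; multiply by 100 to
    # compare with the "people 97.4" style scores listed above.
    print(concept["name"], round(concept["value"] * 100, 1))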

Imagga
created on 2018-03-23

paper 29.4
currency 28.7
cash 28.4
money 28.1
container 26.1
business 23.7
dollar 23.2
finance 22.8
bank 22.4
wealth 21.5
banking 20.2
envelope 18.8
exchange 17.2
bill 17.1
grunge 16.2
financial 16
investment 15.6
hundred 15.5
dollars 15.4
packet 15.1
banknote 14.5
design 14.2
savings 14
economy 13.9
package 13.6
loan 13.4
book 12.2
stamp 12.1
note 11.9
old 11.8
pay 11.5
vintage 11.5
symbol 11.4
black 11
bills 10.7
market 10.6
stall 10.6
product 10.4
commerce 10.3
rich 10.2
frame 10
art 9.9
retro 9.8
sign 9.8
texture 9.7
paying 9.7
debt 9.6
us 9.6
payment 9.6
film 9.6
pattern 9.6
stack 9.2
dirty 9
blackboard 8.9
success 8.8
finances 8.7
card 8.7
blank 8.6
close 8.6
security 8.3
creation 8.1
antique 7.8
rate 7.8
3d 7.7
change 7.7
house 7.5
buy 7.5
die 7.5
graphic 7.3
digital 7.3
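
The Imagga tags, which read the contents of the drawer as currency, envelopes, and packets, come from its tagging REST endpoint. Below is a minimal sketch against the current v2 API using requests; the key, secret, and image URL are placeholders.

# Sketch: Imagga v2 tagging endpoint with HTTP Basic auth.
# api_key, api_secret, and the image URL are placeholders.
import requests

api_key = "YOUR_API_KEY"
api_secret = "YOUR_API_SECRET"
image_url = "https://example.org/morris_dresser_drawer.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=(api_key, api_secret),
)
resp.raise_for_status()

for tag in resp.json()["result"]["tags"]:
    # Each entry has a confidence score and a tag name keyed by language.
    print(tag["tag"]["en"], round(tag["confidence"], 1))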

Google
created on 2018-03-23

Microsoft
created on 2018-03-23

text 88.1
different 49.6

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 50.5%
Disgusted 50.1%
Happy 49.5%
Calm 49.6%
Sad 49.6%
Surprised 49.5%
Angry 49.5%
Confused 49.6%

AWS Rekognition

Age 16-27
Gender Female, 50.2%
Angry 49.5%
Sad 49.7%
Confused 49.5%
Surprised 49.5%
Happy 49.5%
Disgusted 49.5%
Calm 50.2%

AWS Rekognition

Age 26-44
Gender Female, 50.4%
Surprised 49.6%
Angry 49.6%
Happy 49.6%
Disgusted 49.6%
Confused 49.6%
Calm 49.9%
Sad 49.7%
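
The age ranges, gender guesses, and near-uniform emotion scores above are characteristic of Rekognition face detection with all facial attributes requested. A sketch of the corresponding call follows; the S3 location is a placeholder.

# Sketch: AWS Rekognition face detection with full attributes, which
# returns AgeRange, Gender, and per-emotion confidences like the three
# face records listed above. S3 names are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "morris_dresser_drawer.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')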

Feature analysis

Amazon

Clock Tower 83.2%

Captions

Microsoft

a group of items on a tabletop 44.6%
a group of items in it 44.5%
a bunch of items that are on a table 44.4%
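
The Microsoft captions read like the output of the Computer Vision describe-image operation, which returns a few candidate captions with confidences. A sketch using the Azure SDK for Python follows; the endpoint, key, and image URL are placeholders.

# Sketch: Azure Computer Vision image description, which yields caption
# candidates with confidence scores like those above.
# Endpoint, key, and image URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://example-region.api.cognitive.microsoft.com",
    CognitiveServicesCredentials("YOUR_KEY"),
)

description = client.describe_image(
    "https://example.org/morris_dresser_drawer.jpg",
    max_candidates=3,
)

for caption in description.captions:
    # Confidence is returned in [0, 1]; scale to percent for comparison.
    print(caption.text, round(caption.confidence * 100, 1))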

Text analysis

Amazon

Express
Remington
O.E.BULLIS
h
SLLIS
wrig
h 7>)->3
P.
7>)->3
me
PX4106
P. P.JO:.SON 4100A. PX4106
4100A.
wrig Anorm
P.JO:.SON
Anorm
oe
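
The word fragments above (Express, Remington, O.E.BULLIS, and the rest) are raw OCR detections; Rekognition's text-detection call returns each detected line and word with a confidence score. A sketch follows; the S3 names are placeholders.

# Sketch: AWS Rekognition text detection, whose DetectedText values
# correspond to the raw word fragments listed above.
# S3 bucket and key are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "morris_dresser_drawer.jpg"}},
)

for detection in response["TextDetections"]:
    # Type is either LINE or WORD; both carry a confidence score.
    print(detection["Type"], detection["DetectedText"], f'{detection["Confidence"]:.1f}%')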

Google

O.E.BULLIS
O.E.BULLIS
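
The Google entries are likewise OCR output; Cloud Vision's text detection returns a full-text annotation followed by per-word entries, which is one reason a string like O.E.BULLIS can appear more than once. A sketch with the google-cloud-vision client follows; the image URI is a placeholder.

# Sketch: Google Cloud Vision text detection. The first annotation is the
# full detected text, followed by individual words. Image URI is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(
    source=vision.ImageSource(image_uri="gs://example-bucket/morris_dresser_drawer.jpg")
)

response = client.text_detection(image=image)

for annotation in response.text_annotations:
    print(annotation.description)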