Human Generated Data

Title

Untitled (woman and child in front of fruit stand)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7514

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 97.1
Human 97.1
Person 95
Person 93.9
Car 92.5
Automobile 92.5
Vehicle 92.5
Transportation 92.5
Shop 90.9
Bazaar 70.6
Market 70.6
Spoke 64.2
Machine 64.2
Tire 58.7
Workshop 55.5

Clarifai
created on 2023-10-25

people 99.8
monochrome 99.6
street 97.9
adult 97.5
vehicle 96.9
man 96.6
transportation system 96.3
group together 93.9
group 93.2
many 89.4
stock 88.8
war 88.6
no person 87.7
commerce 87.5
two 84
merchant 81.6
market 81.4
city 80.4
road 79.7
one 79.2

Imagga
created on 2022-01-08

shop 36
city 32.4
building 32.2
architecture 30.5
mercantile establishment 28.4
stall 24.3
urban 21.8
travel 21.8
place of business 18.6
structure 17.1
old 15.3
transportation 15.2
town 14.8
bakery 14.8
street 14.7
business 13.4
center 13.4
office 13
shoe shop 12.8
sky 12.1
industry 12
construction 11.1
finance 11
transport 11
house 10.9
tourism 10.7
light 10
landmark 9.9
sign 9.8
place 9.3
restaurant 9.2
vintage 9.1
road 9
tower 8.9
night 8.9
establishment 8.8
downtown 8.6
cityscape 8.5
buildings 8.5
modern 8.4
famous 8.4
exterior 8.3
history 8
water 8
blackboard 7.9
room 7.8
scene 7.8
financial 7.1
glass 7

Microsoft
created on 2022-01-08

text 99.9
black and white 90.8
street 67.7
several 10.6

Face analysis

Amazon

AWS Rekognition

Age 33-41
Gender Female, 92.6%
Calm 96.2%
Sad 1.5%
Happy 1%
Surprised 0.6%
Fear 0.3%
Disgusted 0.2%
Confused 0.2%
Angry 0.1%

Feature analysis

Amazon

Person 97.1%
Car 92.5%

Captions

Microsoft
created on 2022-01-08

a sign above a store 87.9%
a person standing in front of a store 65.6%
a store front at day 65.5%

Text analysis

Amazon

6000
OVER
OUR
TRE
WE
FRUIT
BEST
OUR OWN FRUIT OVER 6000 TRE
CITRUS
W
OWN
ORA
AVER
W AUD OWN CRUIT AVER
BEST YO
CRUIT
-
Coca-Cola
AUD
Horida
YO
and
frus
424191A--1

Google

ORA BEST YO QUD OWN CRIIIT AVED OUR OWAL FRUIT OVER 6000 TRE Coca-CoL WE lorida CITRUS
ORA
BEST
YO
QUD
OWN
CRIIIT
AVED
OUR
OWAL
FRUIT
OVER
6000
TRE
Coca-CoL
WE
lorida
CITRUS