Human Generated Data

Title

Untitled (children gathered around mule with cart)

Date

1969

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11307

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 98.6
Human 98.6
Person 98
Person 98
Wheel 97.3
Machine 97.3
Person 96.6
Person 96
Vehicle 93.9
Transportation 93.9
Wagon 93.9
Person 89.7
Carriage 87.3
Spoke 85.1
Person 83.8
Person 82.5
Horse Cart 71.3
Person 69.8
Outdoors 67.9
Nature 67.2
Building 67
Housing 67
Person 58.3
Cannon 55.9
Weaponry 55.9
Weapon 55.9

Imagga
created on 2022-01-09

picket fence 33.9
snow 31
fence 29.9
winter 23.8
sky 23
landscape 20.8
barrier 20.5
structure 20.4
scene 18.2
cold 18.1
carriage 16.3
travel 16.2
weather 15.6
scenery 15.3
old 14.6
season 14
forest 13.9
outdoor 13.8
obstruction 13.7
building 13.4
trees 13.3
park 13.2
tree 13.1
beach 12.6
wheeled vehicle 12.5
holiday 12.2
road 11.7
architecture 11.7
city 11.6
rural 11.5
bench 11.3
grunge 11.1
dark 10.9
silhouette 10.8
night 10.7
negative 10.7
sun 10.6
frost 10.6
house 10
film 9.9
vintage 9.9
frosty 9.8
cool 9.8
cloud 9.5
cityscape 9.5
vehicle 9.4
day 9.4
light 9.4
ice 9.2
street 9.2
field 9.2
environment 9.1
sunset 9
seasonal 8.8
seat 8.7
fog 8.7
water 8.7
black 8.4
sign 8.3
tourism 8.3
pattern 8.2
horizon 8.1
transportation 8.1
park bench 8.1
scenic 7.9
urban 7.9
freight car 7.9
car 7.8
sea 7.8
freeze 7.8
construction 7.7
frozen 7.6
outdoors 7.6
skyline 7.6
vacation 7.4
transport 7.3
morning 7.2

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

outdoor 98.3
text 97.6
horse 96.2
drawn 94.2
carriage 80.1
tree 79.5
house 77.6
old 75.8
sky 57
cart 31.9

Face analysis

Amazon

AWS Rekognition

Age 29-39
Gender Male, 98.7%
Calm 70%
Sad 20.7%
Happy 4.8%
Confused 1.9%
Angry 1.1%
Surprised 0.7%
Disgusted 0.6%
Fear 0.2%

AWS Rekognition

Age 38-46
Gender Male, 73.6%
Calm 59.1%
Happy 23.5%
Fear 11%
Sad 3.8%
Angry 0.8%
Disgusted 0.7%
Confused 0.6%
Surprised 0.5%

AWS Rekognition

Age 29-39
Gender Female, 85%
Calm 51.5%
Fear 40.8%
Surprised 2.7%
Happy 1.7%
Sad 1.3%
Disgusted 1.2%
Confused 0.4%
Angry 0.4%

Feature analysis

Amazon

Person 98.6%
Wheel 97.3%

Captions

Microsoft

an old photo of a horse drawn carriage 96.7%
old photo of a horse drawn carriage 95.8%
a person riding a horse drawn carriage 92.5%

Text analysis

Amazon

KODAK
577KODAK
KODAK C.VEE1XEV.S
АТОРАЯА2
АСТЯОЛА АТОРАЯА2 ДЕМИЕТС
АСТЯОЛА
ДЕМИЕТС
C.VEE1XEV.S

Google

ACHOH ATOZASAZ JEMVIEI
ACHOH
ATOZASAZ
JEMVIEI