Human Generated Data

Title

Untitled (people in carriage, crowd watching)

Date

1938

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19532

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (confidence, %)

Amazon
created on 2022-03-05

Machine 99.7
Wheel 99.7
Mammal 99.2
Horse 99.2
Animal 99.2
Horse Cart 98.8
Wagon 98.8
Transportation 98.8
Vehicle 98.8
Bicycle 93.2
Bike 93.2
Person 86.9
Human 86.9
Wheel 81.8
Person 71.9
Carriage 71
Person 60.3
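
The Amazon tags above are the kind of labels returned by AWS Rekognition's DetectLabels operation. A minimal sketch follows, assuming boto3 with configured AWS credentials; the image file name and the 60% floor (roughly the lowest score shown in this record) are illustrative, not part of the original data.

```python
import boto3

rekognition = boto3.client("rekognition")

# Read the photograph and request labels above an illustrative confidence floor.
with open("steinmetz_carriage.jpg", "rb") as f:  # placeholder file name
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=60,  # assumption: matches the lowest tag shown above
    )

# Print "Name Confidence" pairs in the same style as the tag list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```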

Imagga
created on 2022-03-05

cart 100
horse cart 100
wagon 97.6
wheeled vehicle 66.5
carriage 46.2
vehicle 37.7
horse 33.5
transportation 22.4
old 21.6
rural 19.4
animal 17.1
wheel 17
transport 14.6
outdoors 14.2
travel 14.1
man 13.4
horses 12.7
farm 12.5
city 12.5
wheelchair 12.4
urban 11.4
antique 11.3
street 11
people 10.6
landscape 10.4
sky 10.2
male 9.9
chair 9.9
fence 9.7
building 9.5
grass 9.5
winter 9.4
outdoor 9.2
historic 9.2
park 9.1
history 8.9
disabled 8.9
bicycle 8.8
country 8.8
riding 8.8
ride 8.7
architecture 8.6
speed 8.2
care 8.2
road 8.1
sunset 8.1
snow 8
mammal 8
bike 7.8
scene 7.8
driver 7.8
sick 7.7
beach 7.6
historical 7.5
wood 7.5
vintage 7.4
vacation 7.4
water 7.3
countryside 7.3
tourist 7.3
seat 7.2
trees 7.1
summer 7.1
sea 7
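
The Imagga tags appear to come from Imagga's image-tagging endpoint, which reports confidences on a 0-100 scale. A hedged sketch using the requests library; the API key, secret, and image URL are placeholders.

```python
import requests

# Placeholder credentials and image URL; Imagga uses HTTP Basic auth.
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/steinmetz_carriage.jpg"},
    auth=("<api_key>", "<api_secret>"),
)

# Each entry carries an English tag name and a 0-100 confidence.
for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```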

Google
created on 2022-03-05

Wheel 95.4
Working animal 87
Cart 86
Motor vehicle 85.3
Vehicle 83.3
Carriage 80.2
Art 77.8
Pack animal 76.4
Horse and buggy 74.4
Horse tack 72.7
Rein 65.6
Bovine 65
Livestock 64.7
Paper product 64.3
Horse supplies 62.3
Font 61.5
Illustration 60.7
Tire 60.5
Monochrome 59.5
Bridle 59.4
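
The Google tags match the shape of Google Cloud Vision label detection, which scores labels on a 0-1 scale (rendered in this record as percentages). A minimal sketch, assuming the google-cloud-vision client with application-default credentials; the file name is illustrative.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Load the photograph as raw bytes.
with open("steinmetz_carriage.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

# Request label annotations and print them as "description percent".
response = client.label_detection(image=image)
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")
```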

Microsoft
created on 2022-03-05

horse 97.4
text 93.8
outdoor 90.3
drawn 86.5
black and white 83.9
transport 78
black 65.5
vehicle 63.7
land vehicle 61.5
horse-drawn vehicle 56
old 52
carriage 51.6
cart 43.7
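
The Microsoft tags look like output from Azure Computer Vision's Tag Image operation (confidences 0-1, shown here as percentages). A sketch against the REST endpoint, assuming an Azure Computer Vision resource; the endpoint, key, and image URL are placeholders.

```python
import requests

endpoint = "https://<resource-name>.cognitiveservices.azure.com"  # placeholder
response = requests.post(
    f"{endpoint}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": "<subscription-key>"},  # placeholder
    json={"url": "https://example.org/steinmetz_carriage.jpg"},
)

# Print tag names with confidences scaled to percentages.
for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```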

Face analysis

Amazon

AWS Rekognition

Age 18-24
Gender Female, 98.3%
Sad 99.1%
Calm 0.3%
Fear 0.2%
Disgusted 0.1%
Angry 0.1%
Confused 0.1%
Happy 0.1%
Surprised 0%
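
The age, gender, and emotion estimates above follow the shape of AWS Rekognition's DetectFaces response when all facial attributes are requested. A minimal sketch, assuming boto3 with AWS credentials; the file name is illustrative.

```python
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_carriage.jpg", "rb") as f:  # placeholder file name
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

# Print one block per detected face, mirroring the layout of the record above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```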

Feature analysis

Amazon

Wheel 99.7%
Horse 99.2%
Bicycle 93.2%
Person 86.9%

Captions

Microsoft

a person riding a horse drawn carriage in front of a building 90%
a person riding a horse drawn carriage 89.9%
a horse drawn carriage in front of a building 89.8%
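
The caption candidates resemble Azure Computer Vision's Describe Image operation, which returns ranked captions with confidences. A sketch against the REST endpoint; the endpoint, key, and image URL are placeholders.

```python
import requests

endpoint = "https://<resource-name>.cognitiveservices.azure.com"  # placeholder
response = requests.post(
    f"{endpoint}/vision/v3.2/describe",
    params={"maxCandidates": 3},  # ask for several caption candidates
    headers={"Ocp-Apim-Subscription-Key": "<subscription-key>"},  # placeholder
    json={"url": "https://example.org/steinmetz_carriage.jpg"},
)

# Print caption text with confidence as a percentage.
for caption in response.json()["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')
```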

Text analysis

Amazon

7705
A70A

Google

9105
9105
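
The short strings above are OCR fragments; AWS Rekognition's DetectText (and, similarly, Google Cloud Vision text detection) returns detections of this kind per line and per word. A minimal sketch of the Rekognition call, assuming boto3; the file name is illustrative.

```python
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_carriage.jpg", "rb") as f:  # placeholder file name
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Print only line-level detections with their confidences.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(f'{detection["DetectedText"]} {detection["Confidence"]:.1f}%')
```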