Human Generated Data

Title

Untitled (car crashed into tree)

Date

c. 1970, printed from 1954 negative

People

Artist: Francis J. Sullivan, American, 1916-1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18490

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Person 99.3
Human 99.3
Person 98.5
Car 98.1
Automobile 98.1
Vehicle 98.1
Transportation 98.1
Machine 96.2
Wheel 96.2
Bumper 95.8
Car 88.9
Wheel 88.7
Offroad 79.4
Tire 74.7
Person 60.5
Car 58.4
Spoke 55.7

Imagga
created on 2022-02-25

motor vehicle 48.3
car 44.4
wheeled vehicle 37.5
beach 33.7
vehicle 32.4
beach wagon 28.3
sand 28.2
sea 28.2
boat 27.9
sky 25.5
coast 24.2
ocean 24.1
golf equipment 22.3
water 22
transportation 20.6
travel 20.4
vacation 19.6
summer 18.6
fishing 18.3
shore 18.1
machine 16.9
sports equipment 16.7
landscape 16.4
equipment 15.8
mechanical device 15.3
swing 15
backhoe 15
ship 14.7
holiday 14.3
transport 13.7
road 13.5
old 13.2
mechanism 13
island 12.8
tourism 12.4
wood 11.7
broken 11.6
dirt 11.5
mobile home 11.4
plaything 11.4
yellow 11.3
tropical 11.1
clouds 11
power shovel 11
wheel 10.5
device 10.5
housing 10
outdoor 9.9
tire 9.7
boats 9.7
scenic 9.7
harbor 9.6
auto 9.6
cloud 9.5
industry 9.4
trailer 9.3
tool 9.3
land 9.2
sun 8.9
machinery 8.8
dock 8.8
scene 8.7
work 8.6
sunny 8.6
structure 8.6
construction 8.6
coastline 8.5
desert 8.4
leisure 8.3
environment 8.2
tourist 8.2
trees 8
camper 7.9
wooden 7.9
grass 7.9
day 7.8
amphibian 7.8
tide 7.8
wave 7.8
vessel 7.7
track 7.7
relax 7.6
truck 7.5
earth 7.5
vintage 7.4
chair 7.3
metal 7.2
horizon 7.2
sunset 7.2
wreck 7.1
rural 7

Google
created on 2022-02-25

Microsoft
created on 2022-02-25

outdoor 99.9
sky 99.2
vehicle 97.8
land vehicle 96.9
grass 95.5
car 94.1
wheel 88.9
text 78.7
tire 70.6
old 57.4
auto part 54.4
trailer 40.2
dirt 25.7
van 24.1

Face analysis

Amazon

AWS Rekognition

Age 19-27
Gender Male, 74.1%
Calm 97.5%
Sad 0.7%
Happy 0.6%
Surprised 0.3%
Disgusted 0.2%
Angry 0.2%
Fear 0.2%
Confused 0.2%

AWS Rekognition

Age 18-26
Gender Male, 99.7%
Calm 78.5%
Sad 7.5%
Happy 6.6%
Angry 3.1%
Fear 1.6%
Surprised 1.4%
Disgusted 0.8%
Confused 0.5%

Feature analysis

Amazon

Person 99.3%
Car 98.1%
Wheel 96.2%

Captions

Microsoft

a car parked on the side of a dirt field 92.3%
a car parked in a dirt field 90.7%
a car parked on a dirt road 90.6%

Text analysis

Amazon

609

Google

4609
4609