Human Generated Data

Title

Untitled (men bringing body down hill into hearse)

Date

1953

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19730

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.7
Human 99.7
Car 97.3
Automobile 97.3
Transportation 97.3
Vehicle 97.3
Truck 95.5
Van 93.7
Person 93.2
Wheel 90
Machine 90
Caravan 85.2
Car 73.3
Housing 65.8
Building 65.8
Tarmac 65.2
Asphalt 65.2
Camping 62.8
Wheel 62.4
Shelter 56.4
Nature 56.4
Countryside 56.4
Rural 56.4
Outdoors 56.4
People 55.4
Person 44.4
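
The Amazon tags above are automated labels with per-label confidence scores (the service is AWS Rekognition, as named in the face and feature analyses below). As a minimal sketch, assuming the scan is available as a local JPEG and AWS credentials are already configured, comparable labels could be requested as shown here; the file name, label limit, and confidence threshold are illustrative, not details taken from this record.

# Minimal sketch: image labels from AWS Rekognition (boto3).
# Assumptions not in the record: a local file "sullivan_19730.jpg" and
# configured AWS credentials/region.
import boto3

rekognition = boto3.client("rekognition")

with open("sullivan_19730.jpg", "rb") as image_file:
    response = rekognition.detect_labels(
        Image={"Bytes": image_file.read()},
        MaxLabels=50,        # cap the number of labels returned
        MinConfidence=40.0,  # drop labels below 40% confidence
    )

# Print "Label confidence" pairs, matching the shape of the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')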

Clarifai
created on 2023-10-22

people 99.8
vehicle 99.7
transportation system 98.5
group together 97.8
car 97.7
adult 96.7
driver 93.3
home 93.3
two 92.7
road 91.4
one 91.3
three 91.1
man 90.1
four 89.1
group 88.1
truck 87.9
vintage 87.7
street 87.6
no person 86.7
campsite 85.5

Imagga
created on 2022-03-05

shopping cart 100
handcart 85.3
wheeled vehicle 82.2
container 44.4
conveyance 27.1
mobile home 23.2
structure 22.6
trailer 21.2
housing 20.5
vehicle 19.8
sky 17.9
road 15.4
trees 15.1
grass 13.4
landscape 13.4
rural 13.2
house 12.5
travel 12
building 11.9
tree 11.5
scene 11.3
architecture 10.9
summer 10.9
car 10.8
home 10.4
field 10
forest 9.6
path 9.5
clouds 9.3
beach 9.3
fence 9.2
park 9.1
transportation 9
outdoors 9
light 8.7
day 8.6
industry 8.5
lawn 8.5
outdoor 8.4
new 8.1
country 7.9
chair 7.9
sand 7.9
work 7.9
empty 7.7
winter 7.7
vacation 7.4
street 7.4
speed 7.3
holiday 7.2
farm 7.1
barrier 7.1
sea 7
scenic 7
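
The Imagga tags follow the same label-plus-confidence pattern. As a hedged sketch only: Imagga exposes a public v2 tagging endpoint, and a request along the lines below could return tags of this form. The endpoint, parameter names, response shape, credentials, and image URL shown are assumptions about that public API, not details from this record.

# Hedged sketch: tagging an image via Imagga's public v2 API using requests.
# The credentials and image URL are placeholders.
import requests

IMAGGA_KEY = "your_api_key"        # placeholder
IMAGGA_SECRET = "your_api_secret"  # placeholder
IMAGE_URL = "https://example.org/sullivan_19730.jpg"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
response.raise_for_status()

# Assumed response shape: {"result": {"tags": [{"confidence": ..., "tag": {"en": ...}}]}}
for tag in response.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')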

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

outdoor 98
black and white 90.2
white 89.1
black 79.9
text 77.4
house 76.1
vehicle 67.6
old 45.5

Color analysis

Face analysis

Amazon

AWS Rekognition

Age 19-27
Gender Female, 51.3%
Happy 79.7%
Confused 9%
Calm 3.9%
Disgusted 2%
Sad 1.7%
Surprised 1.7%
Angry 1.2%
Fear 0.8%
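
The face analysis values (age range, gender, and emotion confidences) are the per-face attribute estimates AWS Rekognition returns. A minimal sketch, under the same assumptions as above (local JPEG, configured credentials), of how such attributes could be requested:

# Minimal sketch: face attributes from AWS Rekognition (boto3).
import boto3

rekognition = boto3.client("rekognition")

with open("sullivan_19730.jpg", "rb") as image_file:
    response = rekognition.detect_faces(
        Image={"Bytes": image_file.read()},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions come back with uppercase type names and percentage confidences.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')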

Feature analysis

Amazon

Person 99.7%, 93.2%, 44.4%
Car 97.3%, 73.3%
Truck 95.5%
Wheel 90%, 62.4%

Text analysis

Amazon

108
and
KAGOX
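
The strings above ("108", "and", "KAGOX") are the words Amazon's text detection found in the photograph, reproduced as detected. A minimal sketch, under the same assumptions as above, of how such text detections could be requested from AWS Rekognition:

# Minimal sketch: text detection from AWS Rekognition (boto3).
import boto3

rekognition = boto3.client("rekognition")

with open("sullivan_19730.jpg", "rb") as image_file:
    response = rekognition.detect_text(Image={"Bytes": image_file.read()})

# Print WORD-level detections; LINE entries would repeat the same text grouped by line.
for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        print(detection["DetectedText"])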