Human Generated Data

Title

Porter Funeral Home

Date

1950s

People

Artist: Terry Wood, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.666

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Tire 99.7
Machine 99.6
Car 99.3
Automobile 99.3
Transportation 99.3
Vehicle 99.3
Spoke 99.1
Person 99
Human 99
Person 99
Car Wheel 98.4
Alloy Wheel 97.3
Wheel 96.6
Wheel 96.4
Person 91.2
Coupe 86.9
Sports Car 86.9
Person 85.1
Hot Rod 56.2

Imagga
created on 2022-01-08

garage 75.6
car 35.1
vehicle 27.7
house 27.6
home 23.1
old 22.3
structure 21.4
farm 19.6
mobile home 19.2
housing 19.1
tire 18.5
trailer 18.4
rural 17.6
transportation 17
building 16.9
architecture 16.4
wheeled vehicle 16.3
country 15.8
estate 15.2
automobile 14.4
auto 13.4
real 13.3
wheel 13
road 12.6
wood 12.5
wooden 12.3
transport 11.9
hoop 11.8
bungalow 11.5
sky 11.5
grass 11.1
speed 11
residence 10.7
roof 10.6
travel 10.6
rustic 10.5
modern 10.5
hut 10.5
drive 10.4
construction 10.3
window 10.1
door 10
new 9.7
residential 9.6
antique 9.5
fast 9.3
sport 9.1
vintage 9.1
barn 9
landscape 8.9
band 8.7
luxury 8.6
outdoor 8.4
land 8.3
historic 8.2
countryside 8.2
truck 8.1
suburban 7.9
motor vehicle 7.9
cottage 7.8
wheels 7.8
property 7.7
summer 7.7
tree 7.7
machine 7.6
shelter 7.6
field 7.5
exterior 7.4
light 7.3
work 7.2
trees 7.1
agriculture 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

outdoor 99.8
building 99.7
road 99.4
house 96.3
land vehicle 90.8
vehicle 88.6
black and white 81.4
old 79.9
wheel 71.8
residential 37
car 28.2
roof 7.2

Face analysis

AWS Rekognition

Age 12-20
Gender Male, 99.9%
Calm 95.2%
Confused 1.5%
Disgusted 1.3%
Sad 0.7%
Angry 0.5%
Happy 0.3%
Surprised 0.2%
Fear 0.2%

AWS Rekognition

Age 24-34
Gender Male, 98.7%
Sad 96.8%
Confused 2.3%
Calm 0.7%
Angry 0.1%
Fear 0.1%
Surprised 0%
Happy 0%
Disgusted 0%

AWS Rekognition

Age 19-27
Gender Male, 99.8%
Fear 37.1%
Calm 23.6%
Happy 14.9%
Disgusted 10.1%
Surprised 5.8%
Sad 3.7%
Angry 3.1%
Confused 1.6%

AWS Rekognition

Age 0-4
Gender Male, 97.7%
Happy 60%
Surprised 15.2%
Confused 7.4%
Angry 5.5%
Disgusted 4.3%
Calm 3.4%
Sad 2.5%
Fear 1.9%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Car 99.3%
Person 99%
Wheel 96.6%

Captions

Microsoft

a car parked in front of a house 97.3%
an old car parked in front of a house 96.8%
a vintage car parked in front of a house 95.9%