Human Generated Data

Title

Untitled (man tending garden outside of house)

Date

1946

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15991

Machine Generated Data

Tags

Amazon
created on 2022-03-25

Human 98.3%
Person 98.3%
Outdoors 91.6%
Apparel 80.1%
Clothing 80.1%
Home Decor 76.8%
Automobile 72.5%
Car 72.5%
Vehicle 72.5%
Transportation 72.5%
Garden 70.5%
Worker 65.4%
Face 65.1%
Standing 62.9%
Path 62.4%
Nature 58.6%
Handrail 57.2%
Banister 57.2%
Gardening 55.9%
Plant 55.4%
Window 55%
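
The list above matches the output shape of AWS Rekognition's DetectLabels API (a label name plus a 0-100 confidence score). As a minimal sketch of how such tags could be reproduced, assuming configured AWS credentials and a placeholder local file photo.jpg:

    import boto3

    # Assumes AWS credentials are configured; "photo.jpg" is a placeholder path.
    client = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=25,        # cap on the number of labels returned
            MinConfidence=55.0,  # drop labels below ~55%, matching the cutoff above
        )

    # Each label pairs a name with a 0-100 confidence score.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}%")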

Imagga
created on 2022-03-25

barbershop 57.4%
shop 44.9%
mercantile establishment 34.8%
place of business 23.4%
structure 21.9%
interior 20.3%
window 19.8%
room 17.1%
modern 16.8%
people 16.7%
architecture 16.6%
man 15.4%
building 14.3%
house 14.2%
door 13.7%
mobile home 13.2%
furniture 13.1%
sliding door 12.8%
industry 12.8%
chair 12.7%
barrier 11.9%
establishment 11.7%
glass 11.7%
transportation 11.6%
indoors 11.4%
wall 11.2%
light 10.7%
travel 10.6%
trailer 10.5%
housing 10.4%
adult 10.3%
office 10.2%
work 10.2%
wood 10%
wheeled vehicle 10%
person 9.9%
groom 9.8%
table 9.6%
couple 9.6%
home 9.6%
sky 9.6%
construction 9.4%
water 9.3%
wedding 9.2%
city 9.1%
transport 9.1%
indoor 9.1%
business 9.1%
industrial 9.1%
bride 8.7%
love 8.7%
life 8.6%
sitting 8.6%
male 8.5%
old 8.4%
inside 8.3%
street 8.3%
movable barrier 8.2%
steel 8.1%
vehicle 8%
machine 7.9%
design 7.9%
urban 7.9%
day 7.8%
hospital 7.7%
station 7.7%
luxury 7.7%
marriage 7.6%
obstruction 7.5%
outdoors 7.5%
lifestyle 7.2%
dress 7.2%
decor 7.1%
working 7.1%
happiness 7%
equipment 7%
sea 7%
wooden 7%
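
The Imagga tags above (including WordNet-style hypernyms such as "mercantile establishment" and "wheeled vehicle") have the shape returned by Imagga's public /v2/tags endpoint. A minimal sketch, assuming a placeholder API key/secret pair and a publicly reachable image URL:

    import requests

    # Placeholder credentials and image URL; /v2/tags uses HTTP Basic auth.
    API_KEY = "your_api_key"
    API_SECRET = "your_api_secret"
    IMAGE_URL = "https://example.com/photo.jpg"

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
    )
    response.raise_for_status()

    # Each entry pairs an English tag with a 0-100 confidence score.
    for item in response.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}%")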

Google
created on 2022-03-25

(no tags recorded)

Microsoft
created on 2022-03-25

man 95.3%
black and white 94.9%
outdoor 87.1%
text 86.8%
standing 79.6%
person 77.7%
clothing 73.7%
vehicle 61.4%
monochrome 60.7%
car 59%
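
The Microsoft tags follow the shape of Azure Computer Vision's image-analysis API, whose SDK reports confidences on a 0-1 scale (scaled to percent in the listing above). A minimal sketch with the azure-cognitiveservices-vision-computervision package, assuming a placeholder endpoint, key, and image URL:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from azure.cognitiveservices.vision.computervision.models import VisualFeatureTypes
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder endpoint and subscription key.
    client = ComputerVisionClient(
        "https://<resource-name>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("your_subscription_key"),
    )

    analysis = client.analyze_image(
        "https://example.com/photo.jpg",            # placeholder image URL
        visual_features=[VisualFeatureTypes.tags],  # request tags only
    )

    # SDK confidences are 0-1; multiply by 100 to match the percentages above.
    for tag in analysis.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}%")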

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 100%
Calm 98.4%
Confused 1.1%
Sad 0.4%
Angry 0%
Disgusted 0%
Happy 0%
Surprised 0%
Fear 0%
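
The age range, gender, and ranked emotions above correspond to Rekognition's DetectFaces API with all facial attributes requested. A minimal sketch, reusing the placeholder photo.jpg path from the labels example:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:  # placeholder path
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # include age range, gender, and emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.0f}%")
        # Emotions arrive unsorted; sort descending to match the listing above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")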

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
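
Google Vision reports likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than numeric scores; the rows above come from its face-detection feature. A minimal sketch with the google-cloud-vision client (v2+ interface assumed), using application default credentials and the same placeholder photo.jpg:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()  # assumes application default credentials

    with open("photo.jpg", "rb") as f:  # placeholder path
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Each attribute is a Likelihood enum; .name yields e.g. "VERY_UNLIKELY".
    for face in response.face_annotations:
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)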

Feature analysis

Amazon

Person 98.3%
Car 72.5%

Captions

Microsoft

a man standing in front of a window 86.5%
a man standing next to a window 84.6%
a man standing in front of a building 84.5%
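
The ranked captions mirror Azure Computer Vision's describe-image call, which can return several candidate captions with confidences. A minimal sketch, with the same placeholder endpoint, key, and image URL as in the tags example:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder endpoint and subscription key.
    client = ComputerVisionClient(
        "https://<resource-name>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("your_subscription_key"),
    )

    description = client.describe_image(
        "https://example.com/photo.jpg",  # placeholder image URL
        max_candidates=3,                 # request several ranked captions
    )

    # SDK confidences are 0-1; multiply by 100 to match the percentages above.
    for caption in description.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")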