Human Generated Data

Title

Untitled (woman looking at view of river from balcony of retirement home)

Date

1961

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16092.1

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Human 99.8
Person 99.8
Patio 99.2
Porch 99
Person 97.9
Person 97.3
Handrail 96.5
Banister 96.5
Pergola 96
Person 95.9
Person 95.6
Person 94.8
Person 93
Railing 92.6
Person 87.8
Person 78.6
Person 71.5
Person 71.3
Outdoors 65.1
Person 62.1
Person 57.5
Arbour 56.7
Garden 56.7

Imagga
created on 2022-02-11

structure 36.6
bridge 35.3
city 33.2
architecture 29.1
balcony 28.7
travel 26.8
building 24.7
urban 24.5
sky 22.4
water 18.7
transportation 17.9
landmark 17.1
ocean 16.6
vacation 16.4
patio 15.9
sea 15.6
modern 15.4
tourism 14.8
area 14.7
tower 14.3
deck 14.1
window 13.5
river 13.3
road 12.6
new 12.1
business 12.1
support 11.9
tourist 11.2
landscape 11.2
transport 11
traffic 10.4
cityscape 10.4
suspension bridge 10.4
street 10.1
light 10
device 10
car 9.9
downtown 9.6
buildings 9.5
trip 9.4
glass 9.3
famous 9.3
barrier 9.3
town 9.3
island 9.2
people 8.9
ship 8.8
highway 8.7
skyscraper 8.6
skyline 8.5
outdoor 8.4
summer 8.4
house 8.4
passenger 8.3
vehicle 8.3
door 8.1
gate 8
interior 8
steel 8
step 7.9
holiday 7.9
station 7.8
port 7.7
cable 7.6
beach 7.6
bay 7.5
outdoors 7.5
boat 7.4
park 7.4
historic 7.3
reflection 7.3
office 7.3
coast 7.2
train 7.2
pier 7.2
trees 7.1
conveyance 7.1

Google
created on 2022-02-11

Sky 95.7
Water 94.4
Black 89.5
Cloud 87.6
Plant 86
Tree 83.7
Shade 81.4
Travel 80.2
Urban design 78.9
Building 78.5
Tints and shades 77.2
Rectangle 76.1
Leisure 75.2
Lake 65
Metal 64.5
Landscape 60.1
Room 58.9
Roof 56.8
Apartment 56.3
T-shirt 55.5

Microsoft
created on 2022-02-11

outdoor 95.3
sky 88.1
water 85.2
person 84.7
clothing 84.7
building 83
lake 82.7
text 77.5
vacation 75.6
tree 54.5
cloud 50.4

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 97.2%
Calm 96%
Sad 1.7%
Confused 0.8%
Surprised 0.5%
Disgusted 0.4%
Angry 0.3%
Happy 0.2%
Fear 0.2%

AWS Rekognition

Age 45-51
Gender Female, 55.4%
Sad 87.1%
Calm 11.8%
Happy 0.3%
Confused 0.2%
Disgusted 0.2%
Fear 0.2%
Angry 0.1%
Surprised 0.1%

AWS Rekognition

Age 13-21
Gender Female, 88%
Fear 61.1%
Calm 29.9%
Sad 4.6%
Angry 1.4%
Happy 1.2%
Surprised 0.8%
Disgusted 0.6%
Confused 0.4%

AWS Rekognition

Age 16-24
Gender Male, 96.7%
Calm 65.5%
Sad 31.9%
Fear 0.7%
Confused 0.5%
Disgusted 0.5%
Angry 0.4%
Surprised 0.3%
Happy 0.2%

AWS Rekognition

Age 23-31
Gender Male, 98.6%
Calm 99.2%
Happy 0.4%
Surprised 0.1%
Confused 0.1%
Disgusted 0.1%
Sad 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 27-37
Gender Male, 99.5%
Sad 95.1%
Calm 4.2%
Confused 0.2%
Angry 0.1%
Fear 0.1%
Happy 0.1%
Surprised 0.1%
Disgusted 0.1%

AWS Rekognition

Age 24-34
Gender Male, 99.2%
Calm 67.2%
Sad 27.4%
Happy 1.8%
Fear 1%
Angry 1%
Confused 0.6%
Disgusted 0.5%
Surprised 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%

Captions

Microsoft

a man and a woman standing in front of a building 70.9%
a man and a woman standing in front of a bridge 46.1%
a person standing in front of a building 46%

Text analysis

Amazon

KODYK
CELA
Car CELA EIT.
Car
!!
EIT.

Google

WACOM
WACOM