Human Generated Data

Title

Untitled (woman looking at view of river from balcony of retirement home)

Date

1961

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16094.2

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Human 99.7
Person 99.7
Patio 99.4
Person 98.8
Person 98.6
Handrail 98.6
Banister 98.6
Porch 98.5
Person 98.3
Railing 97.6
Person 96.3
Person 94.9
Pergola 93.1
Person 92.7
Person 88.9
Person 86.9
Person 84.1
Outdoors 68.4
Person 61.7
Arbour 58.3
Garden 58.3
Person 50.9
Person 50.3
Person 42.9

Imagga
created on 2022-02-11

structure 36.7
building 30
architecture 29.4
city 29.1
urban 25.3
car 23.7
sky 21.2
travel 21.1
bridge 20.5
transportation 19.7
road 17.2
street 16.6
billboard 16.4
wheeled vehicle 16.3
passenger car 16.3
tourism 15.7
vehicle 15.6
house 15
balcony 14.3
bay 14.2
sea 14.1
modern 14
water 14
shuttle bus 13.6
traffic 13.3
signboard 12.9
clouds 12.7
landscape 12.6
ocean 12.4
exterior 12
transport 11.9
office 11.8
highway 11.6
conveyance 11.4
area 11.3
business 10.9
shuttle 10.9
vacation 10.6
home 10.4
industry 10.2
day 10.2
public transport 10.1
outdoor 9.9
landmark 9.9
buildings 9.5
patio 9.2
outdoors 9
reflection 8.9
tourist 8.9
light 8.7
scene 8.7
window 8.6
glass 8.6
tree 8.5
horizontal 8.4
old 8.4
town 8.3
truck 8.1
metal 8
steel 8
gate 7.9
park 7.8
high 7.8
housing 7.8
construction 7.7
downtown 7.7
door 7.7
concrete 7.7
cityscape 7.6
island 7.3
trailer 7.3
horizon 7.2
coast 7.2
passenger 7.2
holiday 7.2

Google
created on 2022-02-11

Sky 95
Water 93.3
Building 89.2
Shade 86.8
Cloud 86.7
Plant 86.6
Tree 82
Travel 80.8
Rectangle 80.2
Morning 79.2
Urban design 78.8
Tints and shades 77.4
Lake 74.1
Leisure 72
T-shirt 69.7
Metal 68.3
Fence 65.8
Room 65.1
Chair 65
Landscape 62.1

Microsoft
created on 2022-02-11

person 90.2
sky 86.9
outdoor 85.9
tree 85.4
clothing 84.5
text 71.3
water 60
vacation 54.8
trip 54.1
lake 52.4

Face analysis

AWS Rekognition

Age 43-51
Gender Female, 76.5%
Calm 56.9%
Happy 32.6%
Sad 4.6%
Surprised 1.7%
Angry 1.3%
Confused 1.2%
Disgusted 1%
Fear 0.7%

AWS Rekognition

Age 6-12
Gender Male, 97.9%
Fear 48%
Sad 34.5%
Calm 5.3%
Angry 4.8%
Disgusted 3%
Happy 2.3%
Surprised 1.3%
Confused 0.8%

AWS Rekognition

Age 23-31
Gender Male, 99.1%
Calm 57.6%
Happy 16.6%
Sad 13.2%
Fear 3.8%
Angry 2.8%
Disgusted 2.1%
Surprised 2%
Confused 1.9%

AWS Rekognition

Age 29-39
Gender Male, 90.8%
Sad 96.1%
Disgusted 2%
Calm 0.6%
Fear 0.3%
Happy 0.3%
Angry 0.2%
Confused 0.2%
Surprised 0.1%

AWS Rekognition

Age 18-26
Gender Male, 99.9%
Calm 75.3%
Sad 21.1%
Confused 2.2%
Angry 0.4%
Happy 0.3%
Surprised 0.3%
Disgusted 0.3%
Fear 0.2%

AWS Rekognition

Age 50-58
Gender Male, 99.7%
Sad 54.7%
Calm 33.9%
Confused 3.9%
Disgusted 3.9%
Happy 2%
Angry 0.8%
Surprised 0.5%
Fear 0.4%

AWS Rekognition

Age 22-30
Gender Female, 52.3%
Calm 78.9%
Sad 15.2%
Confused 1.6%
Fear 1.4%
Happy 1%
Disgusted 0.7%
Surprised 0.6%
Angry 0.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a group of people walking on a bridge 54.5%
a group of people walking across a bridge 54.1%
a group of people on a bridge 44.3%

Text analysis

Amazon

2
KODYK
EIRN

Google

YT37
XACON
MJIF YT37 A2 XACON
MJIF
A2