Human Generated Data

Title

Untitled (woman looking at view of river from balcony of retirement home)

Date

1961

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16093

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Handrail 100
Banister 100
Person 99.8
Human 99.8
Railing 99.7
Person 98.8
Person 98.1
Patio 97.6
Porch 96.2
Person 94
Person 93.8
Person 93.5
Person 92.4
Person 88.9
Person 87.5
Person 80.6
Pergola 80.2
Person 71.1
Person 67.5
Window 65.1
Person 59.5
Outdoors 55.6
Garden 55.6
Arbour 55.6
Person 48.1

Imagga
created on 2022-02-11

structure 34.4
city 31.6
building 28.8
architecture 28.7
sky 23.9
travel 21.1
bridge 20.4
urban 18.3
car 16
balcony 15.8
road 15.4
buildings 15.1
street 14.7
modern 14.7
tower 14.3
night 14.2
sea 14.1
park 13.6
tourism 13.2
deck 12.4
vacation 12.3
town 12.1
water 12
landmark 11.7
transportation 11.7
ocean 11.6
cityscape 11.4
landscape 11.2
house 10.9
barrier 10.5
traffic 10.4
bay 10.4
tourist 9.8
river 9.8
old 9.8
reflection 9.7
skyscraper 9.7
wheeled vehicle 9.6
skyline 9.5
light 9.4
clouds 9.3
exterior 9.2
transport 9.1
office 9.1
window 9.1
sunset 9
new 8.9
sun 8.9
ship 8.8
equipment 8.7
dusk 8.6
business 8.5
trip 8.5
area 8.5
boat 8.4
tract 8.3
passenger car 8.3
patio 8.3
billboard 8.3
center 7.9
obstruction 7.8
device 7.8
highway 7.7
downtown 7.7
outdoor 7.6
tall 7.5
sunrise 7.5
outdoors 7.5
famous 7.4
holiday 7.2
glass 7

Google
created on 2022-02-11

Water 95.9
Sky 95.2
Tree 89.6
Cloud 88
Shade 84.5
Travel 83.6
Rectangle 83.4
Line 81.7
Urban design 81
Tints and shades 77.4
Leisure 76.5
Lake 76.1
Metal 68.2
Horizon 61.8
Tourism 61.3
Landscape 61.1
T-shirt 58.4
Room 58.3
Handrail 55.7
Arch 54.7

Microsoft
created on 2022-02-11

sky 89.8
text 87.8
person 86.9
outdoor 86.4
water 83.6
tree 80.7
lake 79.4
clothing 76
vacation 58.5

Face analysis

AWS Rekognition

Age 40-48
Gender Female, 85.6%
Happy 81.5%
Calm 14.8%
Sad 1.9%
Angry 0.6%
Surprised 0.5%
Confused 0.3%
Disgusted 0.3%
Fear 0.2%

AWS Rekognition

Age 6-16
Gender Male, 95.7%
Disgusted 35.7%
Sad 33.8%
Fear 25%
Angry 2.3%
Calm 1.4%
Happy 0.6%
Surprised 0.6%
Confused 0.5%

AWS Rekognition

Age 19-27
Gender Male, 99.8%
Calm 96.8%
Sad 1.3%
Angry 0.7%
Happy 0.4%
Disgusted 0.3%
Confused 0.2%
Fear 0.2%
Surprised 0.1%

AWS Rekognition

Age 37-45
Gender Male, 99.5%
Calm 98.6%
Sad 0.4%
Happy 0.4%
Disgusted 0.2%
Confused 0.1%
Angry 0.1%
Fear 0.1%
Surprised 0.1%

AWS Rekognition

Age 18-26
Gender Male, 94.4%
Calm 97.6%
Sad 0.8%
Fear 0.3%
Happy 0.3%
Angry 0.3%
Confused 0.3%
Disgusted 0.3%
Surprised 0.2%

AWS Rekognition

Age 22-30
Gender Male, 80.3%
Sad 37.3%
Fear 23.1%
Calm 18.7%
Confused 7.7%
Disgusted 5.6%
Surprised 3.5%
Angry 2.8%
Happy 1.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%

Captions

Microsoft

a man standing in front of a building 63.6%

Text analysis

Amazon

KODYK
LA
KODYK ٠١٢٣
LA EITA
EITA
٠١٢٣

Google

EXAGON
EXAGON