Human Generated Data

Title

Untitled (woman looking at view of river from balcony of retirement home)

Date

1961

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16095.3

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Banister 99.9
Handrail 99.9
Person 99.8
Human 99.8
Person 97.6
Person 97
Railing 96.8
Patio 96.6
Person 96.2
Person 95.8
Porch 93.3
Person 91.2
Person 82.2
Person 81.8
Person 80.6
Pergola 78.7
Person 74.1
Person 72.6
Person 67.2
Outdoors 63.5
Window 59
Person 53.7
Person 42

Imagga
created on 2022-02-11

billboard 36.9
structure 35.1
signboard 28.8
building 25.7
city 24.9
sky 22
architecture 19.3
travel 18.3
urban 17.5
bridge 15.5
electronic equipment 14.9
monitor 14.9
night 14.2
buildings 14.2
clouds 12.7
transportation 12.5
equipment 12.5
tourism 12.4
light 12
landscape 11.9
transport 11.9
office 11.6
business 11.5
water 11.3
cockpit 11.1
industry 11.1
landmark 10.8
tower 10.7
glass 10.1
sunset 9.9
steel 9.7
sun 9.7
window 9.6
construction 9.4
house 9.2
car 9.1
road 9
center 8.9
sea 8.6
dusk 8.6
traffic 8.5
bay 8.5
destination 8.4
device 8.4
town 8.3
tourist 8.3
street 8.3
gate 7.9
cloud 7.7
old 7.7
outdoor 7.6
power 7.6
dark 7.5
evening 7.5
ocean 7.5
silhouette 7.4
famous 7.4
technology 7.4
vacation 7.4
coast 7.2
river 7.1

Google
created on 2022-02-11

Sky 94.5
Water 92.4
Cloud 91.6
Shade 84
Tree 83.6
Travel 81.6
Urban design 81.6
Rectangle 78.7
Leisure 78
Tints and shades 77.3
Fence 68.8
Lake 65.8
Glass 62.5
Room 61.3
Metal 60.4
Daylighting 59.7
Arch 58.9
Handrail 57.6
Tourism 55.6
Horizon 52.9

Microsoft
created on 2022-02-11

water 88.9
outdoor 88.5
sky 87.1
person 82.5
tree 79.6
lake 79
text 75.3
vacation 74.9
clothing 73.8
trip 59.2

Face analysis

Amazon

AWS Rekognition

Age 30-40
Gender Female, 72.3%
Calm 68.5%
Surprised 12.7%
Happy 5.8%
Fear 4.5%
Sad 3.9%
Angry 3.1%
Disgusted 1.2%
Confused 0.3%

AWS Rekognition

Age 29-39
Gender Male, 78.7%
Sad 71.7%
Calm 26.6%
Confused 0.4%
Fear 0.3%
Surprised 0.3%
Angry 0.3%
Disgusted 0.2%
Happy 0.1%

AWS Rekognition

Age 28-38
Gender Male, 90.5%
Fear 82.4%
Sad 9.4%
Confused 2.7%
Calm 1.9%
Surprised 1.5%
Disgusted 1.2%
Angry 0.5%
Happy 0.4%

AWS Rekognition

Age 40-48
Gender Male, 99.8%
Calm 80.1%
Sad 7.3%
Happy 4.6%
Disgusted 2.8%
Fear 2%
Angry 1.4%
Confused 1.1%
Surprised 0.7%

AWS Rekognition

Age 27-37
Gender Male, 65.3%
Calm 83.4%
Happy 7.9%
Fear 3.5%
Sad 2.6%
Disgusted 1.2%
Confused 0.6%
Angry 0.6%
Surprised 0.4%

AWS Rekognition

Age 14-22
Gender Male, 99.3%
Angry 64.8%
Calm 24.2%
Sad 8.2%
Confused 1.2%
Happy 0.7%
Surprised 0.5%
Disgusted 0.3%
Fear 0.2%

AWS Rekognition

Age 23-31
Gender Male, 81%
Fear 47.6%
Calm 24.8%
Sad 13.6%
Confused 5.2%
Disgusted 4.4%
Surprised 2.1%
Happy 1.3%
Angry 1.1%

Feature analysis

Amazon

Person 99.8%

Captions

Microsoft

a group of people walking on a bridge 58.7%
a group of people walking across a bridge 58.4%
a group of people on a bridge 46.2%

Text analysis

Amazon

7E
.....
MACOM
فة ..... KIDA
فة
KIDA

Google

XACOX
7E XACOX
7E