Human Generated Data

Title

Untitled (view from below of nine children posing on outdoor patio with man watching above)

Date

1961

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9827

Machine Generated Data

Tags

Amazon
created on 2022-01-28

Person 97.5
Human 97.5
Person 96.5
Person 96.4
Person 96.2
Person 94.5
Apparel 81.6
Clothing 81.6
Crowd 80.8
Person 78.7
Face 71.6
Nature 67.6
Outdoors 67.4
People 60.6
Water 58.4
Ocean 58.4
Sea 58.4
Female 55.9

Imagga
created on 2022-01-28

cinema 100
theater 100
building 95.4
structure 60.5
architecture 36.5
facade 21.6
history 21.5
old 20.9
city 20.8
church 20.3
famous 17.7
landmark 17.2
exterior 15.7
travel 15.5
capital 15.2
tourism 14.9
sculpture 14.3
house 13.4
fountain 13
sky 12.8
religion 12.5
roof 12.4
balcony 12.3
monument 12.1
park 12.1
historic 11.9
town 11.1
snow 11
ancient 10.4
destination 10.3
window 10.2
statue 9.6
cathedral 9.6
cross 9.4
historical 9.4
wall 9.4
winter 9.4
art 9.1
tower 9
detail 8.9
culture 8.5
tract 8.4
tourist 7.3

Google
created on 2022-01-28

Black 89.8
Line 81.7
Font 81.7
Art 81.4
Building 76.7
Monochrome 74
Rectangle 73.3
Monochrome photography 71.5
Painting 71
Visual arts 69.2
Facade 67.5
Symmetry 67.4
Arch 67.4
Event 65.7
Holy places 64.6
Room 64.5
Classic 64.4
Illustration 64.3
Stock photography 62.3
History 61.4

Microsoft
created on 2022-01-28

tree 97.8
text 94.9
black and white 76.7
old 43.2

Face analysis

Amazon

AWS Rekognition

Age 36-44
Gender Male, 91.8%
Happy 64.9%
Calm 14.0%
Sad 6.6%
Confused 5.1%
Surprised 3.4%
Disgusted 2.6%
Angry 2.2%
Fear 1.2%

AWS Rekognition

Age 45-51
Gender Male, 94.5%
Calm 77.9%
Sad 20.9%
Confused 0.3%
Happy 0.3%
Fear 0.2%
Surprised 0.1%
Angry 0.1%
Disgusted 0.1%

AWS Rekognition

Age 23-31
Gender Female, 76.7%
Sad 82.3%
Calm 8.9%
Happy 4.5%
Surprised 1.2%
Confused 1.1%
Disgusted 0.8%
Angry 0.7%
Fear 0.5%

Feature analysis

Amazon

Person 97.5%

Captions

Microsoft

an old photo of a train 35.9%
an old photo of a train window 35.8%
old photo of a train 30.4%

Text analysis

Amazon

L
H
L H PICR
PICR