Human Generated Data

Title

Untitled (four young girls in fancy dresses standing on staircase)

Date

1945-1970

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10324

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Handrail 100
Banister 100
Railing 99.8
Person 94
Human 94
Staircase 85.3

Imagga
created on 2022-01-29

support 77.7
baluster 61.6
barrier 38.2
step 36.5
bridge 32.1
structure 30.7
obstruction 27.4
architecture 27.3
device 24.2
city 23.3
travel 19.7
building 19
railing 17.9
water 17.3
ocean 15.8
pier 15.7
sky 15.3
river 15.1
urban 14.8
tourism 14.8
balcony 14.2
landmark 13.5
house 13.4
sea 13.3
road 11.7
evening 11.2
construction 11.1
dock 10.7
landscape 10.4
exterior 10.1
wood 10
tower 9.8
modern 9.8
wooden 9.7
entrance 9.7
beach 9.3
outdoor 9.2
coast 9
home 8.8
light 8.7
day 8.6
stairs 8.4
transport 8.2
deck 8.1
metal 8
night 8
scene 7.8
clouds 7.6
cityscape 7.6
bay 7.5
outdoors 7.5
interior 7.1
summer 7.1
scenic 7

Google
created on 2022-01-29

Stairs 85.5
Rectangle 83.2
Font 79.7
Slope 79.3
Tree 76.9
Parallel 76.1
Tints and shades 72.1
Wood 72
Pattern 69.5
Handrail 68.1
Baluster 67.2
Room 61.4
Plant 59.9
Metal 57.7
Art 57.5
Monochrome 57
Urban design 56.6
Monochrome photography 56.2
Illustration 56.1
Visual arts 50.4

Microsoft
created on 2022-01-29

text 86.1
black and white 81
stairs 79.3
step 39.5
stair 27.2
net 18.7

Face analysis

Amazon

AWS Rekognition

Age 33-41
Gender Female, 87.3%
Happy 98.9%
Sad 0.3%
Fear 0.2%
Calm 0.2%
Surprised 0.2%
Disgusted 0.1%
Angry 0.1%
Confused 0.1%

AWS Rekognition

Age 43-51
Gender Female, 91.2%
Happy 75.4%
Calm 9.8%
Surprised 5.2%
Sad 3.7%
Fear 2.2%
Disgusted 1.9%
Angry 1.1%
Confused 0.8%

Feature analysis

Amazon

Person 94%
Staircase 85.3%

Captions

Microsoft

a group of people standing next to a fence 79.6%
a group of people standing in front of a fence 77.7%
a group of people standing by a fence 77.4%

Text analysis

Amazon

20051
KODAK--2A1TW

Google

37A°2
-
XAGON
MJI7-- YT 37A°2 - - XAGON
MJI7--
YT