Human Generated Data

Title

Untitled (group of women in robes descending staircase singing Christmas carols)

Date

1962

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9872

Machine Generated Data

Tags (confidence, %)

Amazon
created on 2022-01-28

Handrail 100
Banister 100
Human 98.2
Person 98.2
Railing 97.7
Person 96.3
Person 92.9
Person 91.7
Person 83.7
Person 83.6
Person 79.9
Person 78.3
Person 75.7
Interior Design 75.7
Indoors 75.7
Person 73.3
Staircase 70.1
Person 54
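
The label list above resembles the output of AWS Rekognition's DetectLabels operation. A minimal boto3 sketch, assuming a local copy of the photograph; the file name, region, and threshold are placeholders, not part of the record:

import boto3

# Any region that offers Rekognition works; us-east-1 is an assumption.
rekognition = boto3.client("rekognition", region_name="us-east-1")

# "photo.jpg" is a hypothetical local copy of the photograph.
with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # the lowest score in the list above is Person 54
    )

for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))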

Imagga
created on 2022-01-28

building 55
architecture 46.7
balcony 39
structure 38.9
cinema 36
city 32.4
theater 29
old 18.8
column 18.6
facade 17.4
support 17.3
urban 16.6
bridge 16.6
window 16.4
landmark 15.3
wall 14.7
history 13.4
railing 12.9
step 12.8
light 12.7
church 12
exterior 12
travel 12
columns 11.8
marble 11.5
barrier 11.2
construction 11.1
sculpture 11
stone 11
arch 10.9
house 10.9
tourism 10.7
interior 10.6
modern 10.5
ancient 10.4
sky 10.2
night 9.8
organ 9.6
windows 9.6
hall 9.6
statue 9.5
monument 9.3
inside 9.2
historic 9.2
religion 9
new 8.9
device 8.8
roof 8.6
glass 8.6
capital 8.5
buildings 8.5
art 8.3
stairs 8
river 8
design 7.9
high 7.8
antique 7.8
keyboard instrument 7.8
architectural 7.7
classical 7.6
dark 7.5
famous 7.4
classic 7.4
town 7.4
style 7.4
lights 7.4
indoor 7.3
detail 7.2
tower 7.2
wind instrument 7.1
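
The Imagga tags above, with their 0-100 confidence scores, match the shape of Imagga's v2 tagging endpoint. A minimal REST sketch; the API credentials and image URL are placeholders:

import requests

# Placeholder credentials; Imagga issues a key/secret pair per account.
API_KEY = "your_api_key"
API_SECRET = "your_api_secret"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},  # placeholder URL
    auth=(API_KEY, API_SECRET),
)

# Each entry carries a confidence score and a language-keyed tag name.
for tag in response.json()["result"]["tags"]:
    print(tag["tag"]["en"], round(tag["confidence"], 1))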

Google
created on 2022-01-28
(no labels returned)

Microsoft
created on 2022-01-28

stairs 94.8
text 77.1
black and white 74.3
black 68.4
white 67.6
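
The Microsoft tags above look like output from the Azure Computer Vision tagging API, which reports confidences on a 0-1 scale (shown above scaled to percent). A minimal sketch; the endpoint, key, and image URL are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("your_key"),           # placeholder key
)

result = client.tag_image("https://example.org/photo.jpg")  # placeholder URL

for tag in result.tags:
    print(tag.name, round(tag.confidence * 100, 1))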

Face analysis

AWS Rekognition

Age 24-34
Gender Male, 80.5%
Calm 90.3%
Sad 5.7%
Confused 1.5%
Happy 0.9%
Disgusted 0.5%
Surprised 0.4%
Angry 0.3%
Fear 0.3%

AWS Rekognition

Age 25-35
Gender Female, 80.1%
Calm 90.7%
Sad 3.2%
Happy 1.8%
Confused 1.3%
Disgusted 1.1%
Angry 0.8%
Surprised 0.7%
Fear 0.5%
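
The two age/gender/emotion blocks above match AWS Rekognition's DetectFaces output with full attributes enabled. A minimal boto3 sketch; the file name and region are placeholders:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # region assumed

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the photograph
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required for age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:  # one entry per emotion type
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")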

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely
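
The five likelihood blocks above (one per detected face) match Google Cloud Vision face detection, which reports each attribute as a bucketed likelihood rather than a percentage. A minimal sketch with the google-cloud-vision client; the file name is a placeholder and credentials come from the environment:

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # uses GOOGLE_APPLICATION_CREDENTIALS

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum: VERY_UNLIKELY .. VERY_LIKELY.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)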

Feature analysis

Amazon

Person 98.2%

Captions

Microsoft

a vintage photo of a person 69.1%
a vintage photo of a building 69%
a vintage photo of an old building 65.8%
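
The ranked captions above resemble Azure Computer Vision's image description API, which returns caption candidates with 0-1 confidences. A minimal sketch; the endpoint, key, and image URL are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("your_key"),           # placeholder key
)

# Ask for several caption candidates, as in the three captions above.
result = client.describe_image("https://example.org/photo.jpg", max_candidates=3)

for caption in result.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")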

Text analysis

Amazon

MJI7--YТ3RA°--
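
The string above is the raw text the service detected in the image, reproduced as-is. Output of this shape typically comes from AWS Rekognition's DetectText operation; a minimal boto3 sketch, with file name and region as placeholders:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # region assumed

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the photograph
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Rekognition returns both LINE and WORD detections; print the lines.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])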