Human Generated Data

Title

Untitled (overhead view of audience and performance on stage with several actors in blackface)

Date

c. 1955

People

Artist: Martin Schweig, American, 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9542

Machine Generated Data

Tags

Amazon
created on 2022-01-28

Indoors 99.5
Interior Design 99.5
Room 97.1
Human 89.1
Person 89.1
Person 86.2
Person 83.8
Person 83.2
Person 79.9
Living Room 69.2
Food 68.8
Meal 68.8
Hall 68.8
Person 67.7
Cafeteria 65.6
Restaurant 65.6
Furniture 61.3
People 60.6
Stage 59.6
Crowd 58.2
Theater 58.2
Person 56.5
Auditorium 56.2
Classroom 55.9
School 55.9
Court 55.1
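
These tags are the label-and-confidence pairs emitted by AWS Rekognition's DetectLabels operation. A minimal sketch of how such a list could be produced, assuming configured boto3 credentials and a local copy of the photograph (the filename is hypothetical):

import boto3

# Hypothetical local copy of the photograph; any JPEG/PNG bytes work.
with open("schweig_untitled.jpg", "rb") as f:
    image_bytes = f.read()

rekognition = boto3.client("rekognition")

# DetectLabels returns scene/object labels with 0-100 confidence scores,
# matching entries like "Indoors 99.5" and "Stage 59.6" above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55.0,  # the list above bottoms out around 55
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")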

Imagga
created on 2022-01-28

shelf 43.5
case 43
interior 42.4
house 36.7
room 33.6
window 31.5
home 31.1
shop 30.8
architecture 30.4
table 28.2
decor 26.5
furniture 23.6
modern 21.7
mercantile establishment 21.7
design 19.1
luxury 18.9
shoe shop 17.8
decoration 17.6
inside 17.5
apartment 17.2
glass 16.9
building 16.9
wall 16.6
kitchen 16.2
wood 15.8
indoor 14.6
place of business 14.4
residential 14.3
style 14.1
light 14
elegant 13.7
chair 13.2
comfortable 12.4
lamp 12.4
indoors 12.3
structure 12.1
floor 12.1
city 11.6
living 11.4
flowers 11.3
cabinet 11.1
toyshop 10.6
travel 10.6
urban 10.5
dining 10.5
elegance 10.1
frame 10
domestic 9.9
restaurant 9.8
old 9.7
estate 9.5
contemporary 9.4
3d 9.3
classic 9.3
door 8.9
chairs 8.8
render 8.6
expensive 8.6
hotel 8.6
real 8.5
art 8.2
new 8.1
balcony 8
chandelier 7.9
mansion 7.8
flower 7.7
sofa 7.6
rich 7.4
hall 7.3
food 7.2
establishment 7.1
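
The Imagga tags follow the same tag-plus-confidence pattern and could be reproduced through Imagga's v2 tagging endpoint, which authenticates with an API key/secret pair over HTTP Basic auth. A minimal sketch; the credentials and filename are placeholders:

import requests

# Placeholder Imagga credentials (issued per account).
API_KEY = "your_api_key"
API_SECRET = "your_api_secret"

# File-upload variant of the /v2/tags endpoint.
with open("schweig_untitled.jpg", "rb") as f:
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )

# Each entry pairs an English tag with a 0-100 confidence, matching
# entries like "shelf 43.5" and "case 43" above.
for entry in response.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")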

Google
created on 2022-01-28

Microsoft
created on 2022-01-28

text 91.4
table 72.9
white 72.7

Face analysis

Amazon

AWS Rekognition

Age 19-27
Gender Female, 70.1%
Sad 98.2%
Calm 0.7%
Fear 0.4%
Disgusted 0.2%
Angry 0.2%
Happy 0.1%
Confused 0.1%
Surprised 0.1%

AWS Rekognition

Age 20-28
Gender Male, 94.3%
Sad 75.9%
Calm 20.4%
Happy 1.7%
Angry 0.6%
Disgusted 0.5%
Fear 0.4%
Confused 0.3%
Surprised 0.2%

AWS Rekognition

Age 21-29
Gender Male, 56.6%
Calm 57.8%
Sad 17.2%
Angry 8.9%
Happy 7.7%
Surprised 2.4%
Fear 2.2%
Confused 1.9%
Disgusted 1.8%

AWS Rekognition

Age 18-26
Gender Female, 75.3%
Happy 60.6%
Calm 28.4%
Sad 5.3%
Confused 2.4%
Angry 1.4%
Fear 0.7%
Surprised 0.6%
Disgusted 0.6%

AWS Rekognition

Age 13-21
Gender Male, 58%
Fear 79.6%
Sad 7.2%
Disgusted 7%
Calm 3.1%
Surprised 1%
Angry 0.9%
Happy 0.8%
Confused 0.3%

AWS Rekognition

Age 19-27
Gender Male, 77.9%
Calm 76%
Sad 10.9%
Angry 4.9%
Confused 4.6%
Happy 1.3%
Disgusted 1.1%
Fear 0.7%
Surprised 0.6%

AWS Rekognition

Age 18-24
Gender Male, 98.7%
Calm 54.4%
Happy 18.3%
Confused 7.8%
Sad 7.6%
Angry 4.4%
Disgusted 3.7%
Surprised 2.1%
Fear 1.7%

AWS Rekognition

Age 18-24
Gender Male, 94.5%
Calm 54.9%
Sad 30.4%
Angry 4.8%
Happy 3.8%
Confused 2.4%
Fear 1.6%
Disgusted 1.1%
Surprised 1%

AWS Rekognition

Age 12-20
Gender Male, 94.1%
Calm 76.7%
Sad 10.2%
Confused 8.9%
Angry 1.9%
Happy 0.7%
Disgusted 0.6%
Surprised 0.6%
Fear 0.5%
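
Each block above matches the per-face record that AWS Rekognition's DetectFaces operation returns when all facial attributes are requested: an estimated age range, a gender call with confidence, and an emotion distribution. A minimal sketch, assuming configured boto3 credentials (the filename is hypothetical):

import boto3

rekognition = boto3.client("rekognition")

with open("schweig_untitled.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] adds AgeRange, Gender, and Emotions to each face record.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unordered; sort to match the high-to-low lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")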

Feature analysis

Amazon

Person 89.1%

Captions

Microsoft

a group of people in a room 75.3%
a group of people standing in a room 66.3%
a group of people posing for a photo 49.7%
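
Ranked captions like these are the shape of output from Azure Computer Vision's Describe operation when several candidates are requested. A minimal sketch against the v3.2 REST endpoint; the resource endpoint, key, and filename are placeholders:

import requests

# Placeholder Azure Computer Vision resource endpoint and key.
ENDPOINT = "https://<resource>.cognitiveservices.azure.com"
KEY = "your_subscription_key"

with open("schweig_untitled.jpg", "rb") as f:
    image_bytes = f.read()

# Describe with maxCandidates returns several ranked captions,
# like the three listed above.
response = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": 3},
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)

# Azure reports confidence on a 0-1 scale; scale by 100 to match the
# percentage style used above.
for caption in response.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")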

Text analysis

Google

XAGON
YT37A°2
--
MJI7-- YT37A°2 -- XAGON
MJI7--
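
The detected strings are raw OCR fragments, likely reversed or distorted lettering visible in the scene rather than transcription errors. Output of this shape can be produced with Google Cloud Vision's text detection; a minimal sketch, assuming the google-cloud-vision package and configured application credentials (the filename is hypothetical):

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("schweig_untitled.jpg", "rb") as f:
    image = vision.Image(content=f.read())

# TEXT_DETECTION returns one annotation covering all detected text,
# followed by one annotation per fragment.
response = client.text_detection(image=image)

for annotation in response.text_annotations:
    print(annotation.description)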