Human Generated Data

Title

Untitled (two girls dancing on stage while others sit behind them)

Date

1948

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2950

Machine Generated Data

Tags

Amazon
created on 2022-01-21

Person 93.1
Human 93.1
Stage 92.5
Person 89.5
Person 86.7
Person 79.6
Person 79.5
Person 77.3
Clothing 74.9
Apparel 74.9
Indoors 73.8
Person 70.4
Room 67.9
People 65.8
Person 65.4
Person 65.1
Person 60.7
Helmet 59.4
Screen 59.1
Electronics 59.1
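
The Amazon tags above appear to come from AWS Rekognition label detection. A minimal sketch of how comparable name/confidence pairs can be requested with boto3 follows; the local filename and the confidence threshold are placeholders, not values taken from this record.

import boto3

# Sketch: request image labels from AWS Rekognition (detect_labels).
# "annas_untitled_1948.jpg" is a hypothetical local copy of the photograph.
rekognition = boto3.client("rekognition")

with open("annas_untitled_1948.jpg", "rb") as image_file:
    response = rekognition.detect_labels(
        Image={"Bytes": image_file.read()},
        MinConfidence=55,  # assumed threshold; the scores above bottom out near 59
    )

# Each label carries a name and a 0-100 confidence, matching the list format above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")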

Clarifai
created on 2023-10-26

people 98.3
man 93.1
indoors 92.9
monochrome 91.7
adult 89.8
no person 89.2
audience 88.8
musician 88.2
music 87.1
furniture 86.4
room 86.2
auditorium 85.3
stage 84.9
leader 83.3
family 82.4
group 80.8
architecture 80.4
group together 79.6
woman 78.2
wood 77.4
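
The Clarifai concepts above could plausibly be retrieved from Clarifai's prediction API. The sketch below assumes the v2 REST predict endpoint, key-based authentication, and a general image-recognition model; the model identifier, API key, and image URL are all placeholders, since the record does not state which model produced these tags.

import requests

# Sketch under assumptions: Clarifai v2 predict endpoint with key auth.
# All identifiers below are placeholders, not values from this record.
CLARIFAI_API_KEY = "YOUR_API_KEY"          # placeholder
MODEL_ID = "general-image-recognition"     # assumed model name
IMAGE_URL = "https://example.org/annas_untitled_1948.jpg"  # placeholder URL

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Concepts come back as name/value pairs, where value is a 0-1 confidence.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")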

Imagga
created on 2022-01-21

theater curtain 56.5
curtain 48.7
blind 34.7
amplifier 28.9
interior 27.4
electronic equipment 27
home 23.9
protective covering 23.6
equipment 23.6
wood 23.3
design 23
house 21.7
architecture 17.9
kitchen 17.9
room 17.5
freight car 17.3
wall 16.2
modern 16.1
decor 15.9
car 15.6
style 15.6
old 15.3
building 15
steel 15
furniture 15
vintage 14
radio 14
decoration 13.7
luxury 13.7
art 13.3
light 12.7
antique 12.6
brown 12.5
apartment 12.4
stove 12.4
theater 12
counter 12
structure 11.9
covering 11.8
retro 11.5
wooden 11.4
floor 11.1
classic 11.1
inside 11
sink 10.8
wheeled vehicle 10.5
empty 10.3
pattern 10.2
window 10.1
oven 10
domestic 9.9
new 9.7
metal 9.6
contemporary 9.4
grunge 9.4
music 9.2
frame 9.1
cooking 8.7
golden 8.6
lamp 8.6
glass 8.5
cinema 8.5
3d 8.5
box 8.4
texture 8.3
city 8.3
cook 8.2
gold 8.2
indoor 8.2
set 8.2
border 8.1
receiver 8.1
symbol 8.1
home theater 8.1
graphic 8
cabinet 7.9
black 7.8
tile 7.8
radio receiver 7.7
elegant 7.7
audio 7.6
vehicle 7.6
hotel 7.6
shelf 7.5
technology 7.4
lights 7.4
entertainment 7.4
historic 7.3
night 7.1
blackboard 7.1
indoors 7
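
The Imagga tags above could be reproduced with Imagga's tagging endpoint. The sketch below assumes the v2 /tags API with basic authentication; the key, secret, and image URL are placeholders.

import requests

# Sketch under assumptions: Imagga v2 tagging endpoint with basic auth.
# Key, secret, and image URL are placeholders, not values from this record.
IMAGGA_KEY = "YOUR_API_KEY"        # placeholder
IMAGGA_SECRET = "YOUR_API_SECRET"  # placeholder
IMAGE_URL = "https://example.org/annas_untitled_1948.jpg"  # placeholder URL

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
response.raise_for_status()

# Tags arrive as {"confidence": ..., "tag": {"en": ...}} entries.
for entry in response.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")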

Google
created on 2022-01-21

Rectangle 85.7
Font 81.6
Art 74.5
Monochrome photography 71.2
Facade 70.9
Monochrome 68.4
Event 67.7
Cabinetry 65.4
Curtain 64.7
Glass 63.6
Display device 63.5
Room 62.7
Audio equipment 60.5
Display case 60.1
Metal 55.9
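
The Google labels above match the shape of Cloud Vision label detection output. A minimal sketch with the google-cloud-vision client follows; the filename is a placeholder, and credentials are assumed to be configured in the environment.

from google.cloud import vision

# Sketch: label detection with the Cloud Vision client library.
# Credentials are assumed to be set via GOOGLE_APPLICATION_CREDENTIALS.
client = vision.ImageAnnotatorClient()

with open("annas_untitled_1948.jpg", "rb") as image_file:  # hypothetical local copy
    image = vision.Image(content=image_file.read())

response = client.label_detection(image=image)

# Scores are 0-1; scaling by 100 gives figures comparable to the list above.
for annotation in response.label_annotations:
    print(f"{annotation.description} {annotation.score * 100:.1f}")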

Microsoft
created on 2022-01-21

black and white 83.3
text 63.1
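
The two Microsoft tags above look like Azure Computer Vision image tagging. The sketch below assumes the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and filename are placeholders.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Sketch under assumptions: the classic Computer Vision SDK and its tagging call.
# Endpoint, key, and filename are placeholders, not values from this record.
ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com/"  # placeholder
KEY = "YOUR_KEY"                                                 # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

with open("annas_untitled_1948.jpg", "rb") as image_stream:  # hypothetical local copy
    result = client.tag_image_in_stream(image_stream)

# Each tag has a name and a 0-1 confidence.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")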

Face analysis

Amazon

AWS Rekognition

Age 24-34
Gender Male, 88.3%
Calm 55%
Sad 32.3%
Confused 5.8%
Angry 2.1%
Surprised 1.2%
Fear 1.2%
Disgusted 1.2%
Happy 1.1%

AWS Rekognition

Age 21-29
Gender Female, 93.1%
Calm 83.7%
Sad 6.2%
Disgusted 2.4%
Confused 2.3%
Happy 2%
Fear 1.5%
Angry 1.1%
Surprised 0.8%

AWS Rekognition

Age 20-28
Gender Male, 97.6%
Sad 50.7%
Happy 18.9%
Confused 12.4%
Calm 9.8%
Fear 5.7%
Disgusted 1.5%
Angry 0.7%
Surprised 0.4%

AWS Rekognition

Age 33-41
Gender Male, 94.6%
Disgusted 43%
Sad 21.2%
Calm 11.2%
Confused 7.4%
Angry 7.3%
Happy 6.5%
Fear 1.9%
Surprised 1.5%

AWS Rekognition

Age 26-36
Gender Male, 99.3%
Calm 79.8%
Sad 13%
Confused 1.9%
Happy 1.5%
Fear 1.2%
Disgusted 0.9%
Surprised 0.9%
Angry 0.7%
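
The age ranges, gender estimates, and emotion scores above follow the structure of AWS Rekognition face detection. A minimal boto3 sketch follows; the filename is a placeholder, not a value from this record.

import boto3

# Sketch: facial attribute analysis with AWS Rekognition (detect_faces).
# "annas_untitled_1948.jpg" is a hypothetical local copy of the photograph.
rekognition = boto3.client("rekognition")

with open("annas_untitled_1948.jpg", "rb") as image_file:
    response = rekognition.detect_faces(
        Image={"Bytes": image_file.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are returned unsorted; sort to mirror the highest-first listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")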

Feature analysis

Amazon

Person 93.1%

Text analysis

Amazon

R
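
The single detected character above is attributed to Amazon; it is consistent with AWS Rekognition text detection, sketched below with boto3 under that assumption. The filename is a placeholder.

import boto3

# Sketch: text-in-image detection with AWS Rekognition (detect_text).
# "annas_untitled_1948.jpg" is a hypothetical local copy of the photograph.
rekognition = boto3.client("rekognition")

with open("annas_untitled_1948.jpg", "rb") as image_file:
    response = rekognition.detect_text(Image={"Bytes": image_file.read()})

# LINE detections summarize whole lines; WORD detections give individual tokens.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])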