Human Generated Data

Title

Untitled (large party, mirrored wall, seen from above)

Date

1952

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20175

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Interior Design 99.4
Indoors 99.4
Room 95.4
Person 95.1
Human 95.1
Person 92.1
Person 91.3
Person 90.3
Person 89
Person 78
Person 72.4
Restaurant 72.4
Meal 71.9
Food 71.9
Person 71.1
Person 70.1
Person 67.3
Crowd 66.2
Person 65.8
Person 61.9
Building 60.3
Person 60
Cafeteria 58.2
People 58
Classroom 57.3
School 57.3
Urban 57
Audience 56.8
Person 51.8
Person 47.2
Person 44.1
Person 42.3

Clarifai
created on 2023-10-22

people 99.9
group 99.9
many 99.5
education 98.5
child 98.3
group together 97.5
school 97.2
man 95.9
adult 95.5
woman 95.1
teacher 93.8
indoors 93.6
room 93.5
elementary school 93.4
classroom 93.2
crowd 92.9
boy 92.4
dancing 91.6
music 90
several 87.3

Imagga
created on 2022-03-05

grunge 47.6
old 40.4
vintage 35.5
texture 34
antique 33.1
negative 32.1
film 30.8
aged 28.9
retro 27.8
frame 27.6
grungy 27.5
dirty 25.3
wall 24.6
art 21.2
room 21.2
design 20.8
pattern 20.5
space 20.2
paint 19.9
brown 19.1
rough 19.1
damaged 19.1
building 18.6
ancient 18.1
border 18.1
material 17.9
graphic 16.8
paper 16.5
decoration 16.4
classroom 16.4
text 15.7
edge 15.4
photographic paper 14.9
wallpaper 14.5
artistic 13.9
black 13.8
empty 13.7
structure 13.6
rust 13.5
style 13.3
weathered 13.3
architecture 13.3
decay 12.5
scratches 11.8
scratch 11.7
screen 11.6
messy 11.6
dirt 11.5
worn 11.4
textured 11.4
digital 11.3
frames 10.7
noise 10.7
slide 10.7
drawing 10.7
detailed 10.6
stain 10.6
mask 10.5
canvas 10.4
flower 10
photographic equipment 9.9
noisy 9.9
photographic 9.8
your 9.7
collage 9.6
computer 9.6
aging 9.6
blank 9.4
floral 9.4
plants 9.3
grain 9.2
decorative 9.2
backdrop 9.1
color 8.9
designed 8.9
layered 8.8
mess 8.8
blackboard 8.8
newspaper 8.8
movie 8.7
crack 8.7
layer 8.7
light 8.7
obsolete 8.6
old fashioned 8.6
poster 8.5
dark 8.3
city 8.3
letter 8.2
surface 7.9
highly 7.9
distressed 7.9
grime 7.8
stains 7.8
strip 7.8
modern 7.7
construction 7.7
stained 7.7
concrete 7.7
card 7.6
textures 7.6
yellow 7.3
door 7.1
creation 7.1
decor 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 95.1
person 85.2
white 75.3
black 72.1

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 91.6%
Sad 55.6%
Calm 39.4%
Confused 1.9%
Disgusted 1%
Angry 0.9%
Happy 0.7%
Surprised 0.3%
Fear 0.2%

AWS Rekognition

Age 2-10
Gender Female, 56.6%
Calm 35.4%
Sad 30.7%
Happy 14%
Surprised 4.9%
Fear 4.8%
Confused 4.6%
Angry 3.4%
Disgusted 2.3%

AWS Rekognition

Age 23-31
Gender Female, 53%
Calm 69.5%
Sad 26.4%
Happy 1.1%
Disgusted 0.9%
Fear 0.7%
Surprised 0.6%
Angry 0.6%
Confused 0.3%

AWS Rekognition

Age 27-37
Gender Male, 99%
Calm 42.9%
Sad 21.1%
Angry 19.1%
Disgusted 7.3%
Happy 3.9%
Confused 3.5%
Surprised 1.2%
Fear 1.1%

AWS Rekognition

Age 6-14
Gender Male, 84.3%
Calm 99.4%
Sad 0.3%
Angry 0.1%
Happy 0.1%
Disgusted 0%
Fear 0%
Confused 0%
Surprised 0%

AWS Rekognition

Age 25-35
Gender Female, 68.8%
Calm 33.7%
Happy 29.4%
Sad 13.1%
Fear 10.4%
Confused 4.3%
Angry 4.2%
Surprised 2.9%
Disgusted 2.1%

AWS Rekognition

Age 25-35
Gender Male, 92.9%
Calm 55.8%
Angry 27.8%
Sad 4.8%
Disgusted 4%
Surprised 3.9%
Fear 1.5%
Happy 1.3%
Confused 0.7%

AWS Rekognition

Age 26-36
Gender Male, 94.4%
Calm 77.9%
Sad 20.2%
Confused 1.1%
Happy 0.4%
Angry 0.2%
Disgusted 0.1%
Fear 0.1%
Surprised 0.1%

AWS Rekognition

Age 19-27
Gender Male, 90.2%
Calm 83.6%
Disgusted 7.6%
Sad 4.4%
Fear 1.6%
Happy 1.2%
Angry 0.8%
Confused 0.4%
Surprised 0.3%

AWS Rekognition

Age 18-24
Gender Female, 68.8%
Calm 75.6%
Sad 22.2%
Happy 0.6%
Confused 0.4%
Angry 0.4%
Fear 0.3%
Disgusted 0.3%
Surprised 0.2%

AWS Rekognition

Age 20-28
Gender Male, 50.2%
Calm 52.6%
Happy 17.8%
Sad 11.7%
Fear 7.7%
Angry 6.3%
Disgusted 1.7%
Confused 1.4%
Surprised 0.8%

AWS Rekognition

Age 19-27
Gender Female, 92.6%
Sad 54%
Happy 34.7%
Disgusted 3.9%
Fear 1.7%
Calm 1.6%
Surprised 1.5%
Confused 1.4%
Angry 1.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 95.1%
Person 92.1%
Person 91.3%
Person 90.3%
Person 89%
Person 78%
Person 72.4%
Person 71.1%
Person 70.1%
Person 67.3%
Person 65.8%
Person 61.9%
Person 60%
Person 51.8%
Person 47.2%
Person 44.1%
Person 42.3%

Text analysis

Amazon

EAD
KODAK-.VEELA

Google

YT37A°2-XAGO
YT37A°2-XAGO