Human Generated Data

Title

Untitled (large party, seen from event)

Date

1952

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20174

Machine Generated Data

Tags (each entry pairs a tag with the service's confidence score, %)

Amazon
created on 2022-03-05

Interior Design 97.8
Indoors 97.8
Human 96
Room 94.5
Person 93.1
Person 92.9
Person 84.5
Furniture 83.3
Person 83
Person 79.9
Person 77.7
People 72
Clothing 71.7
Apparel 71.7
Crowd 71
Person 65.3
Drawing 65.3
Art 65.3
Person 63.5
Person 62.7
Floor 61.1
Person 60.3
Person 59.2
Hall 58.1
Chair 57.2
Photography 56.2
Photo 56.2
Person 48.6
Person 45.2
Person 42.3
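
These label/score pairs have the shape of AWS Rekognition's DetectLabels output, where each label carries a percent confidence. The sketch below is a minimal, hypothetical reconstruction using boto3; the local file name and the MinConfidence floor are assumptions, not part of this record.

```python
# Minimal sketch (assumed inputs): regenerate label tags with AWS
# Rekognition's DetectLabels via boto3.
import boto3

client = boto3.client("rekognition")

# Hypothetical local scan of the photograph; not part of the record.
with open("4.2002.20174.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=40,  # assumed floor; the lowest score above is 42.3
)

# Each label has a Name and a percent Confidence, matching the
# tag/score pairs listed above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```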

Clarifai
created on 2023-10-22

group 99.7
many 99.7
people 99.5
crowd 98
man 95.6
group together 94.2
woman 93.4
adult 91.9
leader 91.9
indoors 89.8
no person 87.2
audience 86.5
administration 85.1
illustration 83
dancing 81.8
child 81.8
several 81.5
education 81.1
home 80.7
music 80.5
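
Clarifai reports concept scores on a 0-1 scale, read here as percentages. Below is a minimal sketch against Clarifai's v2 REST endpoint; the API key, model ID, and image URL are placeholders, and Clarifai's current SDKs wrap this same call.

```python
# Minimal sketch (placeholder credentials): query a Clarifai
# image-recognition model over the v2 REST API.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"        # placeholder
MODEL_ID = "general-image-recognition"   # assumed public model ID

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {
        "url": "https://example.org/4.2002.20174.jpg"  # placeholder URL
    }}}]},
    timeout=30,
)
resp.raise_for_status()

# Concepts carry a name and a 0-1 value; x100 gives the percent-style
# scores above (e.g. "group 99.7").
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))
```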

Imagga
created on 2022-03-05

negative 95.1
film 73.5
photographic paper 53.8
grunge 36.6
photographic equipment 35.9
old 25.1
texture 24.3
vintage 21.5
art 20.8
antique 19.2
dirty 19
paint 18.1
grungy 18
pattern 17.8
retro 17.2
black 16.8
frame 15.8
design 15.7
aged 15.4
border 14.5
city 14.1
silhouette 14.1
newspaper 13.8
rough 13.7
wallpaper 13
architecture 12.5
drawing 11.9
decoration 11.9
graphic 11.7
space 11.6
material 11.6
weathered 11.4
text 11.3
building 11.3
rock 11.3
creation 10.9
product 10.6
urban 10.5
canvas 10.4
wall 10.4
digital 9.7
surface 9.7
style 9.6
rust 9.6
crowd 9.6
snow 9.5
travel 9.1
backdrop 9.1
structure 9
color 8.9
textured 8.8
light 8.7
artistic 8.7
splash 8.7
edge 8.6
worn 8.6
business 8.5
graffito 7.9
people 7.8
scratch 7.8
ancient 7.8
messy 7.7
collage 7.7
stained 7.7
stain 7.7
aging 7.7
landscape 7.4
park 7.4
symbol 7.4
man 7.4
brown 7.4
water 7.3
computer 7.2
cool 7.1
creative 7.1
modern 7
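
Imagga exposes this kind of tag list through its /v2/tags endpoint, authenticated with an API key/secret pair over HTTP basic auth. The sketch below uses placeholder credentials and a placeholder image URL.

```python
# Minimal sketch (placeholder credentials): fetch tags from Imagga's
# /v2/tags endpoint.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/4.2002.20174.jpg"},  # placeholder
    auth=("api_key", "api_secret"),  # placeholder key/secret pair
    timeout=30,
)
resp.raise_for_status()

# Each entry is {"confidence": float, "tag": {"en": str}}, mirroring
# pairs such as "negative 95.1" and "film 73.5" above.
for item in resp.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))
```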

Google
created on 2022-03-05 (no tags recorded)

Microsoft
created on 2022-03-05

text 90.6
person 88.2
drawing 88.1
white 65.9
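
These scores match the shape of the Azure Computer Vision tagging operation, which returns tag names with 0-1 confidences. A sketch with the azure-cognitiveservices-vision-computervision SDK follows, under the assumption that this is the service behind the "Microsoft" rows; the endpoint, key, and image URL are placeholders.

```python
# Minimal sketch (placeholder endpoint/key): tag an image with Azure
# Computer Vision.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder
    CognitiveServicesCredentials("YOUR_KEY"),                # placeholder
)

result = client.tag_image("https://example.org/4.2002.20174.jpg")  # placeholder

# Tags carry a 0-1 confidence; scaled to percent they match scores
# such as "text 90.6" and "person 88.2" above.
for tag in result.tags:
    print(tag.name, round(tag.confidence * 100, 1))
```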

Face analysis

AWS Rekognition

Age 48-56
Gender Male, 55.2%
Sad 69.5%
Calm 17.6%
Confused 5.6%
Happy 4.3%
Surprised 1.1%
Angry 0.8%
Disgusted 0.7%
Fear 0.4%

AWS Rekognition

Age 11-19
Gender Male, 95.1%
Fear 40.2%
Happy 22.2%
Calm 21.7%
Disgusted 9.3%
Sad 3.5%
Surprised 1.5%
Confused 0.9%
Angry 0.7%

AWS Rekognition

Age 25-35
Gender Male, 61.7%
Happy 73.2%
Sad 10.9%
Surprised 5.6%
Calm 4%
Fear 2.1%
Angry 1.9%
Disgusted 1.5%
Confused 0.8%

AWS Rekognition

Age 21-29
Gender Male, 98.5%
Calm 97.8%
Sad 1.3%
Happy 0.6%
Confused 0.1%
Angry 0.1%
Fear 0.1%
Surprised 0.1%
Disgusted 0.1%
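
Each of the four blocks above matches the FaceDetails structure returned by AWS Rekognition's DetectFaces when all attributes are requested: an estimated age range, a gender guess with confidence, and a full emotion distribution. A minimal sketch, again with an assumed local file:

```python
# Minimal sketch (assumed input file): per-face attributes from AWS
# Rekognition's DetectFaces.
import boto3

client = boto3.client("rekognition")

with open("4.2002.20174.jpg", "rb") as f:  # hypothetical local scan
    image_bytes = f.read()

response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

# One FaceDetail per detected face: AgeRange, Gender, and a
# confidence-ranked list of Emotions, the fields shown in each
# "AWS Rekognition" block above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```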

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
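
Unlike Rekognition, Google Vision's face detection reports each attribute as a likelihood bucket (VERY_UNLIKELY through VERY_LIKELY) rather than a percentage, which is why the rows above read "Very unlikely" / "Very likely". A sketch with the google-cloud-vision client (2.x-style API, placeholder file path):

```python
# Minimal sketch (assumed input file): face likelihoods from the
# Google Cloud Vision API.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("4.2002.20174.jpg", "rb") as f:  # hypothetical local scan
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum; .name yields the bucket label.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```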

Feature analysis

Amazon

Person
Person 93.1%
Person 92.9%
Person 84.5%
Person 83%
Person 79.9%
Person 77.7%
Person 65.3%
Person 63.5%
Person 62.7%
Person 60.3%
Person 59.2%
Person 48.6%
Person 45.2%
Person 42.3%
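
The repeated "Person NN%" rows correspond to individual instances of the Person label in a DetectLabels response; each instance carries its own confidence and bounding box. This continues from the DetectLabels sketch earlier, assuming its `response` is still in scope.

```python
# Minimal sketch: per-instance Person detections from the DetectLabels
# response obtained above (assumed to be in scope as `response`).
for label in response["Labels"]:
    if label["Name"] == "Person":
        for instance in label["Instances"]:
            box = instance["BoundingBox"]  # fractions of image size
            print(
                f"Person {instance['Confidence']:.1f}% at "
                f"left={box['Left']:.2f}, top={box['Top']:.2f}, "
                f"{box['Width']:.2f}x{box['Height']:.2f}"
            )
```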

Text analysis

Amazon

EAD

Google

YT37A°2-XAG
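
Strings like these are OCR hits on the scan itself; edge markings and film codes on a negative often surface this way. With AWS the corresponding call would be DetectText, sketched below with the same assumed input file; Google's equivalent is text detection on the Vision API.

```python
# Minimal sketch (assumed input file): OCR with AWS Rekognition's
# DetectText.
import boto3

client = boto3.client("rekognition")

with open("4.2002.20174.jpg", "rb") as f:  # hypothetical local scan
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# Detections come back as LINE and WORD entries; edge markings such as
# "EAD" above typically appear as short LINE hits.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")
```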