Human Generated Data

Title

Untitled (people celebrating New Year's Eve)

Date

1949

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19367

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.3
Human 99.3
Person 99
Person 98.7
Person 98.2
Photographer 97.4
Person 90.8
Person 83.3
Musician 74.6
Musical Instrument 74.6
Paparazzi 73.4
Person 65.3
Person 64.9
Electronics 61
Interior Design 58.6
Indoors 58.6
Camera 57.8
Person 47.9
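
The Amazon tags above are label-detection output. A minimal sketch of how such tags can be generated with AWS Rekognition's detect_labels call, assuming configured AWS credentials, a local copy of the image (the file name is a placeholder), and an arbitrary confidence cutoff:

```python
# Minimal sketch: label tags with confidences, as in the Amazon list above.
# Assumes AWS credentials are configured; file name and cutoff are placeholders.
import boto3

rekognition = boto3.client("rekognition")

with open("19367.jpg", "rb") as f:           # hypothetical local file name
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=40,                    # assumed cutoff; the actual setting is not documented here
    )

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```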

Clarifai
created on 2023-10-22

people 99.6
music 99.5
group 99.1
musician 98.5
group together 98.4
woman 97.2
adult 95.9
monochrome 95.6
man 95.2
instrument 95
singer 95
stringed instrument 94.8
guitar 94.5
dancing 91.7
adolescent 91.3
violin 90.2
child 90.1
guitarist 88.2
many 88
recreation 87.8
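
The Clarifai concepts above can be reproduced with Clarifai's v2 REST API. A minimal sketch, assuming a personal access token and the public general-image-recognition model; the token and image URL are placeholders:

```python
# Minimal sketch: concept tags from Clarifai's v2 REST API.
# The access token, model ID, and image URL are assumptions/placeholders.
import requests

CLARIFAI_PAT = "YOUR_PAT"                    # placeholder credential
url = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"

payload = {"inputs": [{"data": {"image": {"url": "https://example.org/19367.jpg"}}}]}
headers = {"Authorization": f"Key {CLARIFAI_PAT}"}

response = requests.post(url, json=payload, headers=headers).json()

for concept in response["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```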

Imagga
created on 2022-03-05

people 28.4
man 28.2
person 26.2
adult 21.8
teacher 18.7
salon 18.3
fashion 16.6
clothing 15.6
male 14.9
style 14.8
performer 14.8
attractive 14.7
dress 14.4
room 14.1
professional 14
pretty 14
dancer 13.8
happy 13.1
lifestyle 13
group 12.9
sexy 12.8
women 12.6
brass 12.6
weapon 11.9
interior 11.5
educator 11.4
couple 11.3
ball 11.2
wind instrument 10.7
fun 10.5
football helmet 10.4
home 10.4
business 10.3
happiness 10.2
helmet 10.2
two 10.1
indoor 10
handsome 9.8
sports equipment 9.7
sword 9.6
together 9.6
black 9.6
love 9.5
headdress 9.5
men 9.4
photographer 9.4
smiling 9.4
basketball 9.3
life 9.3
inside 9.2
girls 9.1
musical instrument 9
indoors 8.8
entertainer 8.5
clothes 8.4
portrait 8.4
active 8.4
leisure 8.3
occupation 8.2
gorgeous 8.1
team 8.1
family 8
model 7.8
casual 7.6
equipment 7.6
human 7.5
window 7.5
lady 7.3
smile 7.1
to 7.1
job 7.1
boxing glove 7.1
modern 7
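
The Imagga tags above follow the shape of Imagga's v2 tagging endpoint. A minimal sketch, assuming an API key/secret pair and a publicly reachable image URL (both placeholders):

```python
# Minimal sketch: tag list with confidences from Imagga's v2 tagging endpoint.
# API key/secret and image URL are placeholders.
import requests

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/19367.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),   # placeholder credentials (HTTP basic auth)
).json()

for tag in response["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```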

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

person 99.7
text 88
black and white 85.3
clothing 70.1
group 61.2
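
The Microsoft tags above match the output of Azure Computer Vision's analyze endpoint. A minimal sketch, assuming a provisioned Computer Vision resource; the endpoint, key, and image URL are placeholders:

```python
# Minimal sketch: image tags from the Azure Computer Vision v3.2 analyze endpoint.
# Resource endpoint, key, and image URL are placeholders.
import requests

AZURE_ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
AZURE_KEY = "YOUR_KEY"                                                # placeholder

response = requests.post(
    f"{AZURE_ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
    json={"url": "https://example.org/19367.jpg"},
).json()

for tag in response["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```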

Color Analysis

Face analysis

AWS Rekognition

Age 35-43
Gender Female, 97.2%
Sad 91.9%
Calm 5.6%
Surprised 0.8%
Fear 0.5%
Confused 0.5%
Disgusted 0.3%
Angry 0.2%
Happy 0.2%

AWS Rekognition

Age 12-20
Gender Male, 95.9%
Sad 70.9%
Calm 26.8%
Angry 0.8%
Happy 0.7%
Confused 0.3%
Disgusted 0.2%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 23-33
Gender Male, 96.1%
Surprised 85.5%
Calm 4.2%
Disgusted 4.1%
Confused 2.6%
Angry 2.4%
Fear 0.5%
Sad 0.5%
Happy 0.3%

AWS Rekognition

Age 16-24
Gender Female, 51.9%
Calm 74.5%
Fear 7.7%
Sad 6.5%
Angry 4.7%
Surprised 2%
Disgusted 2%
Happy 1.6%
Confused 0.9%
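
The per-face age, gender, and emotion scores above are Rekognition face-analysis output. A minimal sketch using detect_faces with all facial attributes requested, under the same credential and file-name assumptions as the label sketch:

```python
# Minimal sketch: age range, gender, and emotion scores per detected face.
# Assumes AWS credentials are configured; the file name is a placeholder.
import boto3

rekognition = boto3.client("rekognition")

with open("19367.jpg", "rb") as f:           # hypothetical local file name
    response = rekognition.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```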

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
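
The Google Vision results above report likelihood buckets (e.g. Very unlikely) rather than percentages. A minimal sketch using a recent google-cloud-vision client's face_detection call, assuming application default credentials and a local copy of the image:

```python
# Minimal sketch: per-face likelihood buckets from Google Cloud Vision.
# Assumes application default credentials; the file name is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("19367.jpg", "rb") as f:           # hypothetical local file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```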

Feature analysis

Amazon

Person
Person 99.3%
Person 99%
Person 98.7%
Person 98.2%
Person 90.8%
Person 83.3%
Person 65.3%
Person 64.9%
Person 47.9%
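
The per-person confidences above come from the Instances field of a Rekognition detect_labels response, one entry per detected bounding box. A minimal sketch of extracting them, under the same assumptions as the earlier label sketch:

```python
# Minimal sketch: per-instance "Person" detections from detect_labels.
# Assumes AWS credentials are configured; the file name is a placeholder.
import boto3

rekognition = boto3.client("rekognition")

with open("19367.jpg", "rb") as f:
    labels = rekognition.detect_labels(Image={"Bytes": f.read()})["Labels"]

for label in labels:
    if label["Name"] == "Person":
        for instance in label["Instances"]:   # one entry per detected person box
            print(f'Person {instance["Confidence"]:.1f}%')
```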

Categories

Text analysis

Amazon

as
KODAKSA
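
The detected strings above are raw OCR output of the kind returned by Rekognition's detect_text call. A minimal sketch, under the same credential and file-name assumptions as above:

```python
# Minimal sketch: detected text strings from AWS Rekognition.
# Assumes AWS credentials are configured; the file name is a placeholder.
import boto3

rekognition = boto3.client("rekognition")

with open("19367.jpg", "rb") as f:           # hypothetical local file name
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":          # skip WORD-level duplicates of each line
        print(detection["DetectedText"])
```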