Human Generated Data

Title

Untitled (couples seated at tables near tent at crowded party)

Date

1948-1950

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9284

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99
Human 99
Person 98.8
Musician 98.3
Musical Instrument 98.3
Person 98.3
Person 97.4
Person 95.9
Person 94.9
Person 94.5
Leisure Activities 90
Person 88.9
Person 85.9
Music Band 85.5
Guitarist 80.4
Guitar 80.4
Performer 80.4
Person 66.5
Person 63.3
Crowd 61
Sunglasses 60.5
Accessories 60.5
Accessory 60.5
Poster 58.6
Advertisement 58.6

Clarifai
created on 2023-10-26

people 99.9
group 98.9
group together 98.4
adult 98.4
many 96.8
music 96.3
woman 95.9
man 95.8
child 94.6
musician 92.6
recreation 91.2
sitting 89.7
monochrome 87.9
sit 87.8
wear 87.4
leader 87.3
boy 86.2
chair 85.4
administration 85.1
education 84.5

Imagga
created on 2022-01-23

hairdresser 58.2
barbershop 57.4
shop 46.6
mercantile establishment 35.4
man 30.2
people 27.9
salon 24.9
place of business 23.9
male 20.5
person 16.4
adult 15.2
senior 14
men 13.7
hand 13.7
music 13.5
happy 13.1
black 12.6
room 12.1
establishment 12
work 11.8
smiling 11.6
lifestyle 11.6
women 11.1
together 10.5
human 10.5
group 10.5
old 10.4
style 10.4
hair 10.3
mature 10.2
smile 10
instrument 9.5
sitting 9.4
care 9
working 8.8
concert 8.7
couple 8.7
musical 8.6
two 8.5
portrait 8.4
modern 8.4
health 8.3
worker 8
art 7.9
indoors 7.9
hands 7.8
rock 7.8
husband 7.6
wife 7.6
club 7.5
leisure 7.5
retro 7.4
back 7.3
occupation 7.3
business 7.3
holiday 7.2
handsome 7.1
family 7.1
night 7.1
love 7.1
medical 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

person 99.7
text 94.5
musical instrument 92.7
concert 86
clothing 85.6
guitar 80.3
people 79.6
human face 68.1
group 67.4
black and white 62.8
man 56
crowd 26

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 36-44
Gender Male, 75.1%
Happy 96.4%
Sad 0.8%
Calm 0.7%
Fear 0.7%
Surprised 0.5%
Disgusted 0.4%
Angry 0.3%
Confused 0.2%

AWS Rekognition

Age 37-45
Gender Male, 66.6%
Sad 41%
Happy 30.7%
Calm 15%
Fear 7.5%
Confused 2.5%
Disgusted 1.2%
Surprised 1.1%
Angry 1.1%

AWS Rekognition

Age 23-33
Gender Male, 72.7%
Calm 48.6%
Surprised 18.7%
Sad 9.3%
Disgusted 8%
Angry 5.7%
Happy 3.9%
Confused 3.4%
Fear 2.5%

AWS Rekognition

Age 31-41
Gender Male, 84.6%
Calm 98.4%
Happy 0.7%
Sad 0.3%
Surprised 0.2%
Confused 0.2%
Angry 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 29-39
Gender Female, 60%
Calm 99.8%
Sad 0.2%
Happy 0%
Angry 0%
Surprised 0%
Confused 0%
Disgusted 0%
Fear 0%

Feature analysis

Amazon

Person 99%
Sunglasses 60.5%
Poster 58.6%

Text analysis

Amazon

DON

Google

DOA YT37A2- A
DOA
YT37A2-
A