Human Generated Data

Title

Untitled (people at table at party)

Date

1964

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19205

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.5
Human 99.5
Person 99.5
Person 99.1
Person 99
Person 98.8
Furniture 98.8
Tabletop 98.6
Person 98.6
Person 98.5
Person 98.1
Crowd 96.6
Person 93.4
Suit 92.6
Clothing 92.6
Overcoat 92.6
Coat 92.6
Apparel 92.6
Person 91.3
Meal 90.7
Food 90.7
Chair 89.4
Person 89.1
Audience 87.7
Dish 86.4
Sunglasses 84.2
Accessories 84.2
Accessory 84.2
Table 83.9
Person 77.6
Tuxedo 77.6
Press Conference 72.2
Dining Table 71
People 69.6
Glass 64.7
Sitting 64.6
Tablecloth 62.5
Flower 62.2
Blossom 62.2
Plant 62.2
Restaurant 58.4
Beverage 57.4
Drink 57.4
Cafeteria 56.5
Linen 55.8
Home Decor 55.8
Person 50.3

Clarifai
created on 2023-10-22

people 99.9
group 98.8
woman 97.9
adult 97.7
man 97.6
sit 95.6
furniture 93.2
group together 92.7
chair 92.6
room 91.3
monochrome 91
administration 90.2
many 88.8
recreation 88.5
music 86.8
child 85.5
indoors 84.8
sitting 81.7
leader 81.5
musician 80.4

Imagga
created on 2022-03-05

people 37.9
person 37.4
man 35.6
room 35.5
classroom 31.3
male 29.1
businessman 27.3
meeting 26.4
business 26.1
adult 25.2
businesswoman 24.5
indoors 23.7
happy 23.2
professional 23.1
group 22.5
table 22.5
colleagues 22.3
businesspeople 21.8
office 21.7
teacher 21.3
sitting 20.6
team 20.6
teamwork 20.4
smiling 18.8
working 18.5
corporate 18
together 17.5
lifestyle 17.3
work 17.3
men 17.2
desk 17
computer 16.8
communication 16.8
couple 16.5
talking 15.2
mature 14.9
executive 14.3
senior 14
student 13.6
home 13.5
cheerful 13
indoor 12.8
women 12.6
suit 12.6
job 12.4
worker 12.1
patient 12.1
educator 11.9
associates 11.8
coworkers 11.8
discussing 11.8
center 11.3
casual 11
laptop 10.9
discussion 10.7
four 10.5
workplace 10.5
portrait 10.3
happiness 10.2
40s 9.7
technology 9.6
chair 9.5
day 9.4
friends 9.4
presentation 9.3
inside 9.2
20s 9.2
new 8.9
crew 8.8
staff 8.8
restaurant 8.8
cooperation 8.7
corporation 8.7
education 8.7
30s 8.7
screen 8.4
modern 8.4
horizontal 8.4
barbershop 8.4
fan 8.3
occupation 8.2
successful 8.2
looking 8
enrollee 8
life 8
briefing 7.9
love 7.9
smile 7.8
casual clothing 7.8
conference 7.8
businessmen 7.8
color 7.8
shop 7.8
old 7.7
two 7.6
contemporary 7.5
camera 7.4
success 7.2
family 7.1
nurse 7.1
hall 7

Microsoft
created on 2022-03-05

person 99.5
text 99.1
clothing 87.7
man 81.9
people 73.6
table 72.8
group 67.3
black and white 56.7
chair 53.3
crowd 0.6

Face analysis

AWS Rekognition

Age 48-54
Gender Male, 97.9%
Sad 67.6%
Surprised 12.1%
Happy 5.7%
Fear 4.1%
Calm 4%
Confused 2.7%
Disgusted 2%
Angry 1.8%

AWS Rekognition

Age 38-46
Gender Male, 97.6%
Calm 80.3%
Happy 18.2%
Confused 0.4%
Sad 0.4%
Surprised 0.2%
Disgusted 0.2%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 49-57
Gender Male, 88.3%
Happy 93.4%
Disgusted 4.4%
Calm 0.8%
Confused 0.6%
Surprised 0.4%
Angry 0.2%
Sad 0.2%
Fear 0.1%

AWS Rekognition

Age 48-54
Gender Male, 99.7%
Happy 25.4%
Sad 23.4%
Calm 19.5%
Confused 10.3%
Disgusted 8.2%
Fear 6.6%
Surprised 5.4%
Angry 1.2%

AWS Rekognition

Age 40-48
Gender Male, 99.9%
Surprised 56.6%
Sad 20.1%
Fear 10.9%
Happy 5.5%
Calm 2.8%
Confused 2.6%
Disgusted 1%
Angry 0.5%

AWS Rekognition

Age 54-62
Gender Male, 94.5%
Sad 62.4%
Calm 34%
Confused 1.6%
Happy 0.5%
Disgusted 0.5%
Surprised 0.4%
Angry 0.3%
Fear 0.2%

AWS Rekognition

Age 39-47
Gender Male, 99.6%
Confused 58.8%
Sad 21.2%
Calm 12.9%
Surprised 3%
Disgusted 1.5%
Happy 1.3%
Angry 0.9%
Fear 0.4%

AWS Rekognition

Age 29-39
Gender Male, 97.7%
Fear 65.8%
Calm 13.9%
Sad 9.3%
Angry 4.9%
Surprised 3.3%
Confused 1.1%
Disgusted 0.9%
Happy 0.9%

AWS Rekognition

Age 48-54
Gender Female, 59.4%
Calm 79.4%
Happy 17.2%
Sad 1.6%
Fear 0.8%
Disgusted 0.4%
Angry 0.3%
Surprised 0.2%
Confused 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Person 99.5%
Person 99.1%
Person 99%
Person 98.8%
Person 98.6%
Person 98.5%
Person 98.1%
Person 93.4%
Person 91.3%
Person 89.1%
Person 77.6%
Person 50.3%
Chair 89.4%
Sunglasses 84.2%

Text analysis

Amazon

4
с
MJ17
MAGOM
g
MAGOX
MJ17 YY3RAS
MJIR
MJIR YY37A2
YY37A2
YY3RAS

Google

4. MJI7 YT3 A2 MAGOX MJI3 Y T 37 A2 AAGOX
4.
MJI7
YT3
A2
MAGOX
MJI3
Y
T
37
AAGOX