Human Generated Data

Title

Untitled (young men and woman at tables at ball)

Date

1962

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19175

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99
Human 99
Person 98.3
Person 97.9
Person 97.3
Person 97.2
Person 95.2
Person 92.1
Person 91
Person 90.1
Meal 85.6
Food 85.6
Crowd 83.5
Dating 82.9
Person 81.4
Person 81.3
Person 80.7
Coat 80.6
Apparel 80.6
Suit 80.6
Clothing 80.6
Overcoat 80.6
Restaurant 80.4
Person 79.6
Person 77
People 71.5
Table 67.6
Furniture 67.6
Glass 66.5
Person 61.2
Accessories 60.2
Accessory 60.2
Sunglasses 60.2
Indoors 59
Cafeteria 58.4
Person 58.1
Undershirt 57.5
Dish 57.4
Tuxedo 56
Bar Counter 55.4
Pub 55.4
Drink 55.2
Beverage 55.2
Sitting 55.2

Imagga
created on 2022-03-05

man 45
teacher 41.5
male 39
person 38.8
people 37.4
senior 36.6
room 35.4
adult 32.5
classroom 29.4
home 27.1
table 26
indoors 25.5
men 24.9
professional 24.9
educator 23.8
couple 23.5
happy 22.6
smiling 22.4
meeting 21.7
sitting 20.6
mature 19.5
businessman 18.5
retired 18.4
office 18.4
business 18.2
group 17.7
entrepreneur 17.6
together 17.5
student 17.2
elderly 16.3
retirement 15.4
talking 15.2
lifestyle 15.2
education 14.7
women 14.2
board 13.6
team 13.4
hairdresser 13.4
businesspeople 13.3
interior 13.3
teamwork 13
patient 12.9
businesswoman 12.7
40s 12.7
colleagues 12.6
work 12.6
happiness 12.5
class 12.5
to 12.4
desk 12.3
cheerful 12.2
computer 12
old 11.8
portrait 11.7
working 11.5
smile 11.4
executive 11.2
inside 11
school 10.8
70s 10.8
teaching 10.7
discussion 10.7
hospital 10.7
family 10.7
worker 10.1
indoor 10
hand 9.9
enjoying 9.5
adults 9.5
corporate 9.5
camera 9.2
communication 9.2
wine 9.2
modern 9.1
laptop 9.1
barbershop 9.1
holding 9.1
job 8.9
medical 8.8
casual clothing 8.8
conference 8.8
middle aged 8.8
30s 8.7
chair 8.6
husband 8.6
illness 8.6
wife 8.5
casual 8.5
doctor 8.5
presentation 8.4
horizontal 8.4
blackboard 8.4
occupation 8.3
20s 8.2
specialist 8.2
grandfather 8.2
nurse 8.1
handsome 8
looking 8
restaurant 7.9
grandmother 7.8
boy 7.8
students 7.8
center 7.8
partner 7.7
studying 7.7
exam 7.7
college 7.6
friends 7.5
showing 7.5
study 7.5
technology 7.4
suit 7.2
day 7.1

Google
created on 2022-03-05

Black 89.5
Black-and-white 82.8
Adaptation 79.2
Font 78.8
Suit 78.4
T-shirt 75
Event 73.8
Art 72.8
Monochrome photography 72.4
Monochrome 71.4
Vintage clothing 69.8
Table 68.8
Room 68.4
Crowd 65.6
History 65.5
Photo caption 65.3
Crew 64.8
Stock photography 64.3
Team 60.2
Photographic paper 56.3

Microsoft
created on 2022-03-05

person 99.4
text 98.1
indoor 90.5
clothing 89.8
group 86.5
table 71
man 70.9
human face 66.8
people 62
old 59.7

Face analysis

AWS Rekognition

Age 23-33
Gender Female, 76.4%
Calm 98.8%
Sad 0.6%
Angry 0.2%
Happy 0.2%
Confused 0.1%
Disgusted 0.1%
Surprised 0%
Fear 0%

AWS Rekognition

Age 51-59
Gender Male, 93.7%
Calm 99.8%
Sad 0.1%
Happy 0%
Disgusted 0%
Confused 0%
Angry 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 22-30
Gender Male, 60.3%
Calm 54.6%
Surprised 39.8%
Happy 3%
Disgusted 0.8%
Confused 0.7%
Sad 0.7%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 25-35
Gender Female, 72.5%
Calm 85.8%
Sad 5.7%
Happy 2.7%
Surprised 1.8%
Fear 1.7%
Confused 1.5%
Disgusted 0.6%
Angry 0.3%

AWS Rekognition

Age 23-33
Gender Female, 83.6%
Calm 77%
Happy 10.9%
Fear 3.5%
Sad 2.4%
Disgusted 2.1%
Angry 1.5%
Confused 1.4%
Surprised 1.2%

AWS Rekognition

Age 52-60
Gender Male, 99%
Calm 98.4%
Sad 1.4%
Happy 0%
Disgusted 0%
Surprised 0%
Angry 0%
Fear 0%
Confused 0%

AWS Rekognition

Age 52-60
Gender Female, 58.2%
Calm 81.7%
Sad 10.3%
Happy 3.4%
Confused 3%
Surprised 0.6%
Disgusted 0.5%
Angry 0.3%
Fear 0.2%

AWS Rekognition

Age 30-40
Gender Male, 95.6%
Calm 73%
Happy 13.1%
Sad 6.8%
Angry 3%
Confused 1.8%
Disgusted 1.4%
Fear 0.6%
Surprised 0.4%

AWS Rekognition

Age 33-41
Gender Male, 98.5%
Happy 74.4%
Calm 14.6%
Disgusted 3.8%
Angry 3.4%
Surprised 1.5%
Confused 1.1%
Sad 0.8%
Fear 0.5%

AWS Rekognition

Age 39-47
Gender Female, 61.3%
Angry 54.1%
Happy 16.7%
Confused 13.3%
Calm 6.1%
Surprised 4.8%
Disgusted 2.7%
Sad 2%
Fear 0.4%

AWS Rekognition

Age 23-33
Gender Female, 54.7%
Sad 34%
Calm 34%
Confused 18.9%
Angry 7.5%
Disgusted 2.6%
Surprised 1.4%
Fear 0.8%
Happy 0.7%

AWS Rekognition

Age 33-41
Gender Male, 99.9%
Calm 95.7%
Sad 1.3%
Happy 1%
Confused 0.5%
Surprised 0.5%
Angry 0.4%
Disgusted 0.4%
Fear 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Feature analysis

Amazon

Person 99%
Sunglasses 60.2%

Captions

Microsoft

a group of people posing for a photo 86%
a group of people sitting at a table 84.3%
a group of people posing for the camera 84.2%

Text analysis

Amazon

5
5 9
9
MILE
MAGOX
M113 MAGOX
M113
MILE 173342
173342

Google

VEAAS
VEAAS