Human Generated Data

Title

Untitled (men and woman at table at party)

Date

1952

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20280

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.6
Human 99.6
Person 99.3
Clothing 98.6
Apparel 98.6
Person 98.6
Suit 92.3
Overcoat 92.3
Coat 92.3
Restaurant 89.4
Sitting 78.3
Piano 77.4
Musical Instrument 77.4
Leisure Activities 77.4
Shirt 73.8
Tuxedo 72.6
Meal 68.8
Food 68.8
Lighting 63.8
Waiter 63.1
Cafeteria 58.8
Female 56
Transportation 55.3

Clarifai
created on 2023-10-22

people 99.7
man 98.7
adult 98.4
woman 98.1
two 95.3
group 94
child 93.9
administration 92.1
three 91.8
group together 90.1
vehicle 88.8
indoors 88.4
education 87.2
sit 85.8
boy 85.7
employee 85.3
music 85
wear 84.6
portrait 83.6
four 83.6

Imagga
created on 2022-03-05

man 48.4
passenger 47.1
male 40.4
laptop 37.6
office 36.8
business 34
adult 33.2
sitting 32.6
people 32.3
happy 30.1
businessman 30
computer 28.9
work 28.2
person 23.8
corporate 23.2
smiling 23.1
worker 22.5
men 22.3
table 22.2
job 22.1
meeting 21.7
smile 21.4
working 21.2
modern 21
professional 20.6
team 20.6
executive 20.3
suit 19.8
communication 19.3
together 19.3
desk 18.9
businesswoman 18.2
businesspeople 18
indoors 17.6
lifestyle 17.3
group 16.9
manager 16.8
teamwork 16.7
happiness 16.4
technology 16.3
looking 16
women 15.8
cheerful 15.4
success 15.3
talking 15.2
handsome 15.1
mature 14.9
room 14.6
colleagues 14.6
couple 13.9
two 13.5
confident 12.7
casual 12.7
boss 12.4
portrait 12.3
color 12.2
successful 11.9
education 11.2
senior 11.2
home 11.2
mid adult 10.6
conference 9.8
leader 9.6
workplace 9.5
day 9.4
company 9.3
finance 9.3
joy 9.2
indoor 9.1
hand 9.1
40s 8.8
employee 8.6
elderly 8.6
staff 8.6
friends 8.5
attractive 8.4
phone 8.3
human 8.2
outdoors 8.2
grandfather 8.1
family 8
bright 7.9
standing 7.8
discussion 7.8
chair 7.8
using 7.7
percussion instrument 7.7
tie 7.6
adults 7.6
career 7.6
car 7.5
keyboard 7.5
enjoyment 7.5
fun 7.5
notebook 7.5
glasses 7.4
camera 7.4
positive 7.4
occupation 7.3
alone 7.3
love 7.1

Microsoft
created on 2022-03-05

black and white 95.3
person 94
man 87.9
clothing 87.7
human face 72.5
monochrome 57.2
table 28.9

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 50-58
Gender Male, 54.4%
Happy 77.8%
Calm 19.4%
Surprised 1.5%
Sad 0.5%
Confused 0.4%
Disgusted 0.2%
Fear 0.1%
Angry 0.1%

AWS Rekognition

Age 30-40
Gender Male, 99.1%
Calm 99.7%
Sad 0.1%
Surprised 0.1%
Angry 0%
Disgusted 0%
Happy 0%
Fear 0%
Confused 0%

AWS Rekognition

Age 42-50
Gender Male, 61.5%
Calm 44.7%
Happy 40.7%
Confused 5.9%
Surprised 3.8%
Sad 3.4%
Disgusted 0.8%
Fear 0.4%
Angry 0.4%

AWS Rekognition

Age 18-24
Gender Female, 87.8%
Calm 66.2%
Fear 23.2%
Sad 3%
Disgusted 2.3%
Angry 2.1%
Happy 1.9%
Confused 1%
Surprised 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Piano
Person 99.6%
Person 99.3%
Person 98.6%
Piano 77.4%

Text analysis

Amazon

e8
YAGOX