Human Generated Data

Title

Untitled (eight men posed around fancy dinner table during insurance company banquet)

Date

c. 1930-1945

People

Artist: O. B. Porter Studio, American, active 1930s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10939

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 99
Human 99
Person 98.8
Person 98.7
Person 98.5
Person 97
Person 96.5
Person 96.2
Chair 95.6
Furniture 95.6
Priest 91.8
Bishop 71.8
Person 70.5
Chair 66
People 65.5
Home Decor 64.5
Pope 57.1

Clarifai
created on 2023-10-29

people 99.9
group 98.8
man 98.5
group together 98.2
adult 97.4
woman 96.9
leader 93.5
many 93.2
child 88.3
indoors 88.2
chair 85.1
hotel 84.1
furniture 84
room 82.7
military 81
table 80.9
administration 80.3
restaurant 78.2
monochrome 78
dining room 77.8

Imagga
created on 2022-02-05

percussion instrument 84.9
musical instrument 65.8
marimba 62
waiter 40.2
steel drum 35.9
dining-room attendant 31.5
man 30.2
people 28.4
male 27.6
employee 27.6
person 24
men 22.3
worker 22.1
adult 21.6
happy 21.3
business 20.6
smiling 20.2
home 19.9
mature 18.6
smile 17.8
businessman 16.8
lifestyle 16.6
colleagues 16.5
sitting 15.4
office 14.4
indoors 14
senior 14
women 13.4
together 13.1
standing 13
house 12.5
family 12.4
interior 12.4
businesspeople 12.3
meeting 12.2
doctor 12.2
couple 12.2
group 12.1
modern 11.9
room 11.8
businesswoman 11.8
team 11.6
30s 11.5
working 11.5
portrait 11
happiness 11
40s 10.7
medical 10.6
professional 10.6
talking 10.4
old 10.4
nurse 10.4
kitchen 10
color 10
barbershop 9.8
cheerful 9.7
hospital 9.7
shop 9.6
four 9.6
table 9.5
teamwork 9.3
20s 9.2
indoor 9.1
health 9
chair 8.9
affectionate 8.7
love 8.7
corporate 8.6
life 8.5
casual 8.5
friendly 8.2
children 8.2
to 8
work 7.9
discussing 7.8
day 7.8
boy 7.8
classroom 7.8
businessmen 7.8
education 7.8
discussion 7.8
full length 7.8
attractive 7.7
two 7.6
laughing 7.6
communication 7.6
desk 7.6
executive 7.4
patient 7.1
face 7.1

Google
created on 2022-02-05

White 92.2
Table 90.1
Window 89.9
Black 89.7
Chair 88.4
Black-and-white 82.1
Building 80.9
Monochrome photography 76.7
Monochrome 76.5
Classic 74.2
Event 73.8
Vintage clothing 71.8
History 65.8
Sitting 64.2
Stock photography 64.2
Room 64
Child 54.4
Art 54.2
Suit 53.6
Family 53.1

Microsoft
created on 2022-02-05

table 95.5
outdoor 94.6
candle 77.7
clothing 73.1
tableware 72.9
text 71.1
man 69.6
person 69.1
furniture 62.9
vase 57.9
chair 52.3
old 48.3
dining table 7.8

Face analysis

AWS Rekognition

Age 48-56
Gender Male, 69.7%
Calm 53.1%
Happy 32.9%
Confused 4.3%
Sad 3.6%
Surprised 2.2%
Disgusted 1.8%
Angry 1.3%
Fear 0.7%

AWS Rekognition

Age 48-56
Gender Male, 99.7%
Calm 60.3%
Happy 14.4%
Surprised 8.3%
Sad 8.1%
Fear 3%
Disgusted 2.8%
Confused 1.6%
Angry 1.6%

AWS Rekognition

Age 38-46
Gender Male, 94.7%
Calm 86.3%
Happy 5%
Sad 4.5%
Surprised 1.6%
Confused 1.2%
Disgusted 0.5%
Angry 0.4%
Fear 0.3%

AWS Rekognition

Age 52-60
Gender Male, 99.8%
Happy 97.1%
Surprised 0.7%
Calm 0.6%
Sad 0.5%
Confused 0.5%
Fear 0.3%
Disgusted 0.2%
Angry 0.2%

AWS Rekognition

Age 43-51
Gender Male, 94.6%
Sad 29.3%
Happy 16.2%
Confused 16.1%
Fear 14.8%
Surprised 10.5%
Calm 8.1%
Angry 2.8%
Disgusted 2.2%

AWS Rekognition

Age 48-54
Gender Male, 99.6%
Calm 80.5%
Surprised 5.1%
Sad 5%
Happy 2.8%
Fear 2.5%
Angry 1.6%
Disgusted 1.6%
Confused 0.9%

AWS Rekognition

Age 52-60
Gender Male, 99.7%
Calm 93.5%
Sad 2.1%
Surprised 1.5%
Confused 1.2%
Happy 0.9%
Disgusted 0.3%
Angry 0.3%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%
Person 98.8%
Person 98.7%
Person 98.5%
Person 97%
Person 96.5%
Person 96.2%
Person 70.5%
Chair 95.6%
Chair 66%

Text analysis

Amazon

m
SYLETA
KIEW
Winners
VEEV SYLETA KIEW
VEEV
2000

Google

Winnerg - MJI1 YT3362
Winnerg
-
MJI1
YT3362