Human Generated Data

Title

Britain at War (London taxi driver knitting for war effort)

Date

1939

People

Artist: Carl Mydans, American, 1907 - 2004

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of the Estate of Carl Mydans, P2005.36

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.7
Human 99.7
Aircraft 77
Vehicle 77
Transportation 77
Airplane 77
Finger 73
Machine 70.9
Face 64.7
Meal 62.5
Food 62.5
Clothing 62.3
Apparel 62.3
Animal 60.9
Overcoat 58.8
Coat 58.8
Spoke 55.2
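
The Amazon labels above appear to come from the Rekognition image-labeling service. As a rough illustration only, the following Python sketch shows how comparable label/confidence pairs could be requested with boto3; the file name, MaxLabels, and MinConfidence values are assumptions and not part of the original record.

    import boto3

    # Assumed local copy of the photograph; not part of the original record.
    IMAGE_PATH = "britain_at_war.jpg"

    rekognition = boto3.client("rekognition")
    with open(IMAGE_PATH, "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,      # illustrative limit
            MinConfidence=55,  # roughly matches the lowest score listed above
        )

    # Print "Label confidence" pairs in the same shape as the list above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')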

Clarifai
created on 2023-10-26

people 99.9
adult 99.5
man 99.2
vehicle 99.1
one 98.9
portrait 98.5
transportation system 98.3
car 96.2
two 95.5
war 95.5
military 95.5
monochrome 93.5
aircraft 92.6
aviate 91.3
woman 91.1
airplane 88.6
sitting 88.2
lid 88
veil 87.4
street 86.6

Imagga
created on 2022-01-23

person 24
people 23.4
car 23.2
seat belt 23.1
man 22.8
helmet 22.1
vehicle 21.3
adult 20.7
black 19.6
clothing 18.5
safety belt 18.5
male 17.8
seat 17.3
device 16.8
restraint 16.1
model 15.5
fashion 15.1
portrait 14.9
sexy 14.5
automobile 14.4
human 14.2
one 14.2
driver 13.8
support 13.5
attractive 13.3
lifestyle 13
cockpit 12.9
men 12.9
face 12.1
headdress 12
sitting 12
football helmet 11.5
auto 11.5
equipment 11.5
studio 10.6
transport 10
happy 10
color 10
hand 9.9
transportation 9.9
sport 9.8
engine 9.6
body 9.6
women 9.5
smile 9.3
elegance 9.2
style 8.9
job 8.8
wheel 8.8
looking 8.8
hair 8.7
driving 8.7
work 8.6
drive 8.5
pretty 8.4
emotion 8.3
safety 8.3
crash helmet 8.2
protection 8.2
interior 8
smiling 8
love 7.9
mask 7.8
brunette 7.8
repair 7.7
youth 7.7
dark 7.5
sensual 7.3
exercise 7.3
worker 7.1
posing 7.1
happiness 7
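
The Imagga tags above follow the tag/confidence shape returned by Imagga's REST tagging endpoint. A minimal sketch, assuming the /v2/tags endpoint; the credentials and image URL below are placeholders, not values from this record.

    import requests

    # Placeholder credentials and image URL; both are assumptions for illustration.
    IMAGGA_KEY = "api_key"
    IMAGGA_SECRET = "api_secret"
    IMAGE_URL = "https://example.org/britain_at_war.jpg"

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
    )
    response.raise_for_status()

    # Each entry carries an English tag name and a confidence score.
    for entry in response.json()["result"]["tags"]:
        print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')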

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

cartoon 90.6
outdoor 87.8
text 83.3
black and white 80.6
man 77.4
person 76.2
human face 71
white 61.5
old 57.3
clothing 56.3
drawing 54.2

Color Analysis

Face analysis

AWS Rekognition

Age 47-53
Gender Male, 100%
Calm 83%
Surprised 8.5%
Confused 6.9%
Fear 0.9%
Disgusted 0.4%
Angry 0.2%
Sad 0.1%
Happy 0.1%
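
The age range, gender, and emotion scores above correspond to the face attributes that Rekognition's face-detection call can return. A hedged sketch of how such values might be retrieved; the file name is assumed.

    import boto3

    rekognition = boto3.client("rekognition")
    with open("britain_at_war.jpg", "rb") as f:  # assumed local copy
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age, gender, and emotion estimates
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.0f}%')
        # Emotions come back unsorted; sort to mirror the list above.
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')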

Microsoft Cognitive Services

Age 57
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
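
The Google Vision rows above use the likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) that the Cloud Vision face-detection API returns. A sketch assuming the google-cloud-vision client library and a local image file.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("britain_at_war.jpg", "rb") as f:  # assumed local copy
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Likelihood enums print as names such as VERY_UNLIKELY or VERY_LIKELY.
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)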

Feature analysis

Amazon

Person 99.7%
Airplane 77%

Categories

Imagga

paintings art 63.4%
food drinks 35.1%
people portraits 1.1%

Captions

Text analysis

Amazon

08
CAB
DRINER
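
The three strings above ("08", "CAB", "DRINER", the last presumably an OCR misreading of "DRIVER" on the taxi) match the output shape of Rekognition's text-detection call. A minimal sketch, with an assumed file name.

    import boto3

    rekognition = boto3.client("rekognition")
    with open("britain_at_war.jpg", "rb") as f:  # assumed local copy
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # LINE entries give whole detected strings; WORD entries repeat them word by word.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])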