Human Generated Data

Title

Untitled (Demonstration of the Surgical Use of Ether)

Date

1847

People

Artist: Southworth & Hawes, American, active 1843-1863

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Loan from the Massachusetts General Hospital Archives and Special Collections, 2.1979

Machine Generated Data

Tags

Amazon
created on 2022-05-28

Person 99.5
Human 99.5
Person 99.5
Person 98.6
Person 98.5
Person 90.3
Person 89.4
Person 87.2
Person 73.6
People 70.9
Person 64.9
Officer 64.7
Military Uniform 64.7
Military 64.7
Crowd 64.7
Text 63.2
Painting 60.3
Art 60.3
Sailor Suit 56.3

Clarifai
created on 2023-10-30

people 100
group 99.5
group together 99
adult 98.7
vehicle 98.5
transportation system 97.8
man 96.4
many 95.1
wear 94.4
leader 94.1
military 93.5
art 91.4
woman 91.4
several 91.3
uniform 90.4
three 90
child 89.3
print 88.9
two 88.7
aircraft 87.4

Imagga
created on 2022-05-28

man 27.1
people 22.3
male 21.4
silhouette 19
person 19
musical instrument 18.8
adult 15.9
black 15.1
wind instrument 15.1
world 13.5
men 12.9
business 11.5
grunge 11.1
businessman 10.6
couple 10.4
device 10.2
dark 10
together 9.6
sky 9.6
art 9.2
dirty 9
group 8.9
success 8.8
sport 8.7
women 8.7
light 8.7
accordion 8.6
walk 8.6
old 8.4
fashion 8.3
human 8.2
happy 8.1
aged 8.1
brass 8.1
sunset 8.1
team 8.1
night 8
portrait 7.8
party 7.7
outdoor 7.6
texture 7.6
vintage 7.6
outdoors 7.5
keyboard instrument 7.4
style 7.4
symbol 7.4
retro 7.4
street 7.4
blackboard 7.4
teenager 7.3
suit 7.2
shadow 7.2
photographer 7.1
love 7.1
equipment 7.1
work 7.1

Google
created on 2022-05-28

Motor vehicle 90.7
Suit 75.5
Rectangle 72
Classic 69
Vehicle 66.2
Font 63.8
History 62.5
Crew 57.4
Vintage clothing 56.6
Team 52.9

Microsoft
created on 2022-05-28

text 99.7
clothing 97.1
person 95
man 92.5
black and white 78.5
old 52.4
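
Each tagging service above returns label/confidence pairs on its own scale and vocabulary. A minimal sketch of combining them, assuming a simple dictionary representation (the helper `merge_tags` and the threshold are illustrative, not part of any vendor API; sample values are taken from the tag lists on this page):

```python
# Hypothetical sketch: merge (label, confidence) pairs from several
# tagging services, keeping labels that any service scored at or
# above a threshold. Scores are the percent-style confidences shown
# on this page.

def merge_tags(tag_lists, threshold=90.0):
    """Union of labels (lowercased) scoring >= threshold in any service.

    Returns {label: (best_score, service_that_gave_it)}.
    """
    merged = {}
    for service, tags in tag_lists.items():
        for label, score in tags:
            key = label.lower()
            if score >= threshold and score > merged.get(key, (0.0, ""))[0]:
                merged[key] = (score, service)
    return merged

# Sample values copied from the Amazon, Clarifai, and Microsoft lists above.
tag_lists = {
    "Amazon": [("Person", 99.5), ("Military", 64.7), ("Painting", 60.3)],
    "Clarifai": [("people", 100.0), ("military", 93.5), ("art", 91.4)],
    "Microsoft": [("text", 99.7), ("person", 95.0), ("old", 52.4)],
}

merged = merge_tags(tag_lists)
# "person" is kept at Amazon's 99.5; "military" survives only via Clarifai.
```

Note that lowercasing alone does not reconcile vocabularies: "person" and "people" remain separate keys, which is visible in the lists above as well.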

Face analysis

AWS Rekognition

Age 33-41
Gender Male, 92.5%
Angry 46.4%
Calm 29.3%
Surprised 9.9%
Confused 8.3%
Sad 7.1%
Fear 5.9%
Disgusted 0.9%
Happy 0.5%

AWS Rekognition

Age 26-36
Gender Male, 99.9%
Sad 100%
Surprised 6.3%
Fear 6%
Calm 0.8%
Angry 0.5%
Disgusted 0.4%
Confused 0.2%
Happy 0%

AWS Rekognition

Age 58-66
Gender Male, 100%
Calm 99.5%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Disgusted 0%
Angry 0%
Confused 0%
Happy 0%

AWS Rekognition

Age 29-39
Gender Female, 71.4%
Fear 40.4%
Calm 18.8%
Confused 16.3%
Surprised 11.3%
Sad 9.8%
Disgusted 4.2%
Happy 3.5%
Angry 3%

AWS Rekognition

Age 25-35
Gender Male, 92.2%
Sad 89.3%
Fear 27.9%
Calm 13.7%
Surprised 8.5%
Disgusted 6.2%
Happy 3.5%
Confused 2.4%
Angry 1.7%
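
AWS Rekognition reports an independent confidence per emotion, so the values for one face need not sum to 100% (the second face above shows Sad 100% alongside Surprised 6.3%). Picking the dominant emotion is therefore just a max over the scores. A minimal sketch, using the first face's values from this page (`dominant_emotion` is an illustrative helper, not a Rekognition API call):

```python
# Hypothetical sketch: choose the highest-confidence emotion from a
# Rekognition-style result. Values are the first face analysis above.

face_1 = {
    "Angry": 46.4, "Calm": 29.3, "Surprised": 9.9, "Confused": 8.3,
    "Sad": 7.1, "Fear": 5.9, "Disgusted": 0.9, "Happy": 0.5,
}

def dominant_emotion(emotions):
    """Return the (label, confidence) pair with the highest confidence."""
    return max(emotions.items(), key=lambda item: item[1])

label, score = dominant_emotion(face_1)
# → ("Angry", 46.4)
```

Because the confidences are unnormalized, the dominant label can still be a weak call, as with this 46.4% "Angry" reading.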

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Painting 60.3%

Text analysis

Google

Inainte