Human Generated Data

Title

Untitled (portrait of a team with rackets)

Date

1943

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1718

Machine Generated Data

Tags (confidence, %)

Amazon
created on 2021-12-14

Clothing 99.8
Apparel 99.8
Human 99.8
Person 99.8
Person 99.8
Person 99.5
Person 99.2
Person 99.1
Person 98.8
Person 98.7
Person 98.5
Footwear 98.4
Shoe 98.4
Shoe 98.2
Shoe 97.9
Shoe 97.6
Person 96.9
Shoe 96.7
Shoe 95.1
Shorts 95
Shoe 91.6
Tennis Racket 90.2
Racket 90.2
Shoe 89.6
Face 88.5
Person 88.3
Tennis Racket 87.3
Person 83.7
People 83.5
Shoe 80.3
Outdoors 79.9
Person 73.3
Hat 71
Guitar 70.6
Leisure Activities 70.6
Musical Instrument 70.6
Photo 68.1
Portrait 68.1
Photography 68.1
Female 66.6
Sleeve 63.7
Sailor Suit 59.8
Costume 59.1
Nature 58.8
Dress 57.1
Officer 55.9
Military 55.9
Military Uniform 55.9
Shoe 51.7
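
A list like this matches the shape of Amazon Rekognition's DetectLabels response: one label name per line with a 0-100 confidence. Below is a minimal boto3 sketch of a call that could produce it; the file name and MinConfidence value are illustrative assumptions, not values recorded on this page.

import boto3

# Assumes AWS credentials and region are configured in the environment.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical local scan of the print
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # assumed cutoff; the list above bottoms out near 51.7
    )

for label in response["Labels"]:
    # Repeated entries above (Person, Shoe) would come back as one label
    # whose Instances list holds a bounding box per detected occurrence.
    print(f"{label['Name']} {label['Confidence']:.1f}")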

Imagga
created on 2021-12-14

brass 100
wind instrument 100
bugle 100
musical instrument 67.6
people 29.5
device 26.4
man 24.8
cornet 24.7
male 24.1
sport 22.2
men 20.6
person 20.2
adult 19.7
play 17.2
active 15.3
playing 14.6
professional 13.3
ball 13.1
group 12.9
competition 12.8
business 12.1
fun 12
game 11.6
lifestyle 11.5
happy 11.3
human 11.2
summer 10.9
exercise 10.9
player 10.9
businessman 10.6
boy 10.4
field 10
baritone 10
athlete 9.8
outdoors 9.7
success 9.6
uniform 9.6
grass 9.5
club 9.4
motion 9.4
day 9.4
guy 9.2
hand 9.1
suit 9
recreation 9
team 8.9
women 8.7
standing 8.7
youth 8.5
portrait 8.4
room 8.3
action 8.3
sky 8.3
park 8.2
golfer 8
smiling 7.9
smile 7.8
happiness 7.8
outdoor 7.6
golf 7.6
casual 7.6
leisure 7.5
new 7.3
music 7.2
weapon 7.2
handsome 7.1
family 7.1
to 7.1
work 7.1
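
Imagga exposes its auto-tagger as a REST endpoint rather than an SDK. A minimal sketch of the v2 tags call with requests; the API key, secret, and image URL are placeholders, not values tied to this record.

import requests

API_KEY = "<imagga-api-key>"        # placeholder
API_SECRET = "<imagga-api-secret>"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},  # hypothetical URL
    auth=(API_KEY, API_SECRET),  # Imagga uses HTTP Basic auth
)
resp.raise_for_status()

for entry in resp.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")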

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

person 98.8
text 96.4
clothing 96.4
man 91.3
musical instrument 73.6
group 70.4
people 55.8
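
The Microsoft tags (and the captions at the bottom of this page) have the shape of Azure Computer Vision responses. A sketch of the v3.2 tag operation; the endpoint, key, and file name are placeholders, and the API reports confidence on a 0-1 scale where this page shows 0-100.

import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "<subscription-key>"                                        # placeholder

with open("photo.jpg", "rb") as f:  # hypothetical local file
    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/tag",
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )
resp.raise_for_status()

for tag in resp.json()["tags"]:
    # Scale the 0-1 confidence up to match the percentages shown above.
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")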

Face analysis

AWS Rekognition (face 1)

Age 21-33
Gender Male, 96.5%
Calm 85.6%
Happy 10.3%
Surprised 2.6%
Confused 0.7%
Angry 0.3%
Sad 0.2%
Disgusted 0.2%
Fear 0%

AWS Rekognition (face 2)

Age 22-34
Gender Female, 50.1%
Calm 91.5%
Happy 3.1%
Surprised 1.7%
Sad 1.5%
Angry 1.1%
Disgusted 0.5%
Confused 0.3%
Fear 0.2%

AWS Rekognition (face 3)

Age 27-43
Gender Male, 95.3%
Calm 97.3%
Surprised 0.8%
Happy 0.7%
Angry 0.6%
Sad 0.3%
Confused 0.1%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition (face 4)

Age 21-33
Gender Female, 68.4%
Calm 95.9%
Sad 1.6%
Happy 1.4%
Confused 0.7%
Surprised 0.2%
Angry 0.1%
Disgusted 0%
Fear 0%

AWS Rekognition (face 5)

Age 22-34
Gender Female, 61.7%
Happy 48.4%
Calm 29.4%
Surprised 14.6%
Sad 2.6%
Confused 2.2%
Angry 1.4%
Fear 0.9%
Disgusted 0.5%

AWS Rekognition (face 6)

Age 25-39
Gender Male, 72.2%
Calm 89.7%
Happy 7.4%
Sad 0.9%
Surprised 0.8%
Confused 0.4%
Angry 0.4%
Fear 0.2%
Disgusted 0.2%

AWS Rekognition (face 7)

Age 21-33
Gender Male, 56.3%
Calm 60.5%
Surprised 18.8%
Happy 14.5%
Fear 1.8%
Angry 1.7%
Confused 1.2%
Sad 1%
Disgusted 0.4%

AWS Rekognition (face 8)

Age 22-34
Gender Male, 90%
Calm 91.7%
Surprised 2.3%
Happy 1.9%
Angry 1.2%
Confused 1.1%
Sad 1%
Fear 0.5%
Disgusted 0.3%

AWS Rekognition (face 9)

Age 15-27
Gender Male, 96.4%
Calm 86.3%
Surprised 9.1%
Confused 2.2%
Sad 1%
Happy 0.7%
Angry 0.3%
Fear 0.3%
Disgusted 0.1%

AWS Rekognition (face 10)

Age 22-34
Gender Female, 71.2%
Happy 67.2%
Calm 32.2%
Surprised 0.2%
Sad 0.1%
Disgusted 0.1%
Confused 0.1%
Angry 0%
Fear 0%

AWS Rekognition (face 11)

Age 22-34
Gender Male, 96.4%
Calm 86.7%
Surprised 7.9%
Happy 1.9%
Confused 1.9%
Angry 1%
Sad 0.3%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition (face 12)

Age 21-33
Gender Female, 81.7%
Happy 52.7%
Calm 42.1%
Angry 2.6%
Sad 0.9%
Surprised 0.8%
Confused 0.4%
Fear 0.3%
Disgusted 0.1%
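
Each of the twelve blocks above matches one FaceDetails entry from Rekognition's DetectFaces call when all facial attributes are requested. A minimal boto3 sketch; the file name is a placeholder.

import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical local file
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # needed for AgeRange, Gender, and Emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unordered; the page sorts them by descending confidence.
    for emotion in sorted(face["Emotions"],
                          key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")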

Google Vision

Twelve faces detected; every attribute was rated identically across all twelve:

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
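
These rows correspond to the likelihood fields of a FaceAnnotation from the Cloud Vision face-detection method, which returns buckets (Very unlikely through Very likely) rather than numeric scores. A sketch with the google-cloud-vision client; the file name is a placeholder.

from google.cloud import vision

# Assumes GOOGLE_APPLICATION_CREDENTIALS points at a service-account key.
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # hypothetical local file
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum, rendered above as "Very unlikely".
    for name, value in [
        ("Surprise", face.surprise_likelihood),
        ("Anger", face.anger_likelihood),
        ("Sorrow", face.sorrow_likelihood),
        ("Joy", face.joy_likelihood),
        ("Headwear", face.headwear_likelihood),
        ("Blurred", face.blurred_likelihood),
    ]:
        print(name, vision.Likelihood(value).name)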

Feature analysis

Amazon

Person 99.8%
Shoe 98.4%
Guitar 70.6%
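
These entries plausibly repeat the subset of DetectLabels results that carry localized Instances (bounding boxes), which is how Rekognition separates countable objects such as Person, Shoe, and Guitar from scene-level tags; that reading is an assumption, not something stated on the page. A sketch that filters for such labels, with a placeholder file name.

import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical local file
    response = client.detect_labels(Image={"Bytes": f.read()})

for label in response["Labels"]:
    instances = label.get("Instances", [])
    if instances:  # keep only labels that carry bounding boxes
        print(f"{label['Name']} {label['Confidence']:.1f}% "
              f"({len(instances)} instance(s))")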

Captions

Microsoft

a group of people posing for a photo 90.4%
a group of people posing for the camera 90.3%
a group of people standing next to a window 87.4%
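
These ranked captions match the Azure Computer Vision describe operation, which returns several caption candidates with confidences. A sketch, again with placeholder endpoint and key; maxCandidates=3 is assumed from the three captions shown.

import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "<subscription-key>"                                        # placeholder

with open("photo.jpg", "rb") as f:  # hypothetical local file
    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/describe",
        params={"maxCandidates": "3"},  # assumed from the three captions above
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )
resp.raise_for_status()

for caption in resp.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")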