Human Generated Data

Title

Untitled (group portrait of eleven male tennis players standing on tennis court)

Date

1955-1960

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9627

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 99.8
Person 99.8
Person 99.6
Person 99.4
Person 99.3
Shorts 99.1
Clothing 99.1
Apparel 99.1
Person 99
Person 98.9
Person 98.9
Person 97.1
Person 96.8
Person 93.8
Person 92.9
People 81.6
Sailor Suit 74.2
Military 66.3
Military Uniform 64.3
Officer 59.8
Sports 59
Sport 59
Flooring 56
Racket 55.6
Tennis Racket 55.6
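
The labels above, with confidence scores from 0 to 100, are the kind of output Amazon Rekognition's DetectLabels operation returns. A minimal sketch of how such tags could be reproduced with boto3; the file name and region are placeholders, not part of the original record:

    import boto3

    # Assumed region and local file path; adjust to your setup.
    client = boto3.client("rekognition", region_name="us-east-1")
    with open("photo.jpg", "rb") as f:
        response = client.detect_labels(Image={"Bytes": f.read()}, MaxLabels=25)

    # Each label carries a name and a 0-100 confidence score.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')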

Imagga
created on 2022-01-23

marimba 31.6
percussion instrument 25.9
picket fence 23.7
pedestrian 22.7
fence 22.5
musical instrument 20.1
group 19.3
people 17.3
landscape 14.9
men 14.6
crowd 14.4
barrier 14.4
silhouette 14.1
sky 14
structure 13.7
spectator 13.6
male 12.1
travel 12
person 11.7
scene 11.2
building 10.4
black 10.2
man 9.8
old 9.8
business 9.7
obstruction 9.7
summer 9.6
monument 9.3
teamwork 9.3
city 9.1
tourism 9.1
outdoors 9
women 8.7
sea 8.6
architecture 8.6
sitting 8.6
adult 8.5
winter 8.5
tree 8.5
snow 8.2
team 8.1
sun 8
success 8
mountain 8
holiday 7.9
urban 7.9
stone 7.7
art 7.6
field 7.5
house 7.5
friends 7.5
wood 7.5
friendship 7.5
row 7.4
tourist 7.4
back 7.3
national 7.2
meadow 7.2
rural 7
scenic 7
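
Imagga exposes its tagging model through a REST endpoint rather than an SDK. A hedged sketch using the requests library; the API key, secret, and image URL are placeholders, and the response shape follows Imagga's v2 documentation at the time of writing:

    import requests

    # Placeholder credentials and image URL.
    API_KEY, API_SECRET = "your_key", "your_secret"
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/photo.jpg"},
        auth=(API_KEY, API_SECRET),
    )
    # Tags arrive as {"result": {"tags": [{"confidence": ..., "tag": {"en": ...}}]}}.
    for item in response.json()["result"]["tags"]:
        print(item["tag"]["en"], round(item["confidence"], 1))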

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 97.5
baseball 95.3
black 86.4
person 84.8
white 78.8
posing 72.9
clothing 59
man 56.2
old 55.7
team 31.3
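
Microsoft's tags come from the Azure Computer Vision service, which reports confidence on a 0-1 scale (shown above scaled to percentages). A minimal sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder endpoint and subscription key.
    client = ComputerVisionClient(
        "https://<resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("<subscription-key>"),
    )
    result = client.tag_image("https://example.com/photo.jpg")
    for tag in result.tags:
        # Scale 0-1 confidence to the percentage form used above.
        print(tag.name, round(tag.confidence * 100, 1))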

Face analysis

AWS Rekognition

Age 26-36
Gender Male, 95.1%
Happy 58.1%
Sad 17.6%
Fear 8.1%
Disgusted 6.2%
Angry 4.2%
Calm 2.8%
Confused 1.6%
Surprised 1.4%

AWS Rekognition

Age 33-41
Gender Male, 55.8%
Calm 89.5%
Sad 6.1%
Confused 1.8%
Disgusted 0.7%
Angry 0.6%
Surprised 0.4%
Happy 0.4%
Fear 0.4%

AWS Rekognition

Age 24-34
Gender Female, 50.8%
Calm 99.7%
Sad 0.1%
Happy 0%
Confused 0%
Fear 0%
Disgusted 0%
Angry 0%
Surprised 0%

AWS Rekognition

Age 47-53
Gender Male, 82.6%
Calm 50.8%
Surprised 40.2%
Angry 2.9%
Disgusted 2.3%
Happy 1.8%
Fear 0.9%
Sad 0.7%
Confused 0.4%

AWS Rekognition

Age 25-35
Gender Male, 93.5%
Calm 62.6%
Sad 18.9%
Disgusted 8.6%
Happy 2.9%
Angry 2.9%
Confused 2.3%
Fear 1.1%
Surprised 0.7%

AWS Rekognition

Age 39-47
Gender Male, 98.2%
Confused 28.2%
Sad 19.9%
Calm 16.7%
Fear 11.9%
Disgusted 10.5%
Happy 5.5%
Surprised 4.9%
Angry 2.5%

AWS Rekognition

Age 38-46
Gender Female, 98.9%
Calm 98.3%
Sad 0.6%
Disgusted 0.3%
Confused 0.2%
Surprised 0.2%
Happy 0.1%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 22-30
Gender Male, 91.7%
Calm 81.1%
Disgusted 5.8%
Angry 5.6%
Sad 3.5%
Confused 1.8%
Surprised 1.1%
Happy 0.6%
Fear 0.5%

AWS Rekognition

Age 18-24
Gender Male, 96.9%
Calm 94.9%
Surprised 2.2%
Sad 1.6%
Disgusted 0.4%
Confused 0.4%
Fear 0.2%
Happy 0.2%
Angry 0.1%

AWS Rekognition

Age 29-39
Gender Female, 98.1%
Calm 54.2%
Happy 25.1%
Surprised 7.8%
Disgusted 7%
Fear 2.7%
Angry 1.4%
Confused 1.3%
Sad 0.5%
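
Each block above corresponds to one detected face. Age ranges, gender estimates, and per-emotion confidences like these are what Rekognition's DetectFaces operation returns when all attributes are requested. A sketch, assuming a local file and region as placeholders:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")
    with open("photo.jpg", "rb") as f:
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Emotions arrive unsorted; the record above lists them by confidence.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')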

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
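
The per-face likelihood buckets above (Very unlikely through Very likely) match the Likelihood enum that Google Cloud Vision's face detection returns. A sketch with the google-cloud-vision client; the local file path is a placeholder:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    # Enum names such as VERY_UNLIKELY map to the buckets shown above.
    attributes = ["surprise", "anger", "sorrow", "joy", "headwear", "blurred"]
    for face in response.face_annotations:
        for attr in attributes:
            value = getattr(face, f"{attr}_likelihood")
            print(attr.capitalize(), vision.Likelihood(value).name)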

Feature analysis

Amazon

Person 99.8%

Captions

Microsoft

a group of people posing for a photo 91.4%
a group of people posing for a picture 91.3%
a group of people posing for the camera 91.2%
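
Ranked caption candidates with graded confidences like these are produced by Azure Computer Vision's describe-image operation. A sketch under the same placeholder endpoint and key as the tagging example; max_candidates and the image URL are assumptions:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder endpoint, key, and image URL.
    client = ComputerVisionClient(
        "https://<resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("<subscription-key>"),
    )
    description = client.describe_image("https://example.com/photo.jpg", max_candidates=3)
    for caption in description.captions:
        print(caption.text, round(caption.confidence * 100, 1))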

Text analysis

Amazon

Soly
Banefi
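
The fragments above ("Soly", "Banefi") are OCR readings of marks in the print itself, not text from the record. Rekognition's DetectText operation returns such strings; a sketch, with the file path and region as assumptions:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")
    with open("photo.jpg", "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})

    # LINE detections group words; WORD detections are individual tokens.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])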

Google

T37A°2-
-
MJI7--Y T37A°2- - XAGON
MJI7--Y
XAGON
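
Google's strings look garbled because the OCR engine is transcribing faint marks in the photograph. A sketch of the equivalent call with Google Cloud Vision text detection; the file path is a placeholder:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)
    # The first annotation is the full detected text; the rest are elements.
    for annotation in response.text_annotations[1:]:
        print(annotation.description)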