Human Generated Data

Title

Untitled (thirteen family members posed sitting on patio outdoors)

Date

1960

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9754

Machine Generated Data

Tags

Amazon
created on 2022-01-24

Person 99.7
Human 99.7
Person 99
Person 98.8
Person 98.6
Person 98.1
Person 98
Person 97.5
Person 97.4
Person 95.6
Person 95.3
Person 95
Nature 93.9
Outdoors 93.4
People 92.4
Accessory 88.9
Tie 88.9
Accessories 88.9
Shorts 87.8
Clothing 87.8
Apparel 87.8
Person 84.8
Face 79.1
Vacation 76.9
Sand 75.1
Snow 65.5
Drawing 62.4
Art 62.4
Leisure Activities 61.1
Suit 59.9
Overcoat 59.9
Coat 59.9
Water 58.7
Sea 58.7
Coast 58.7
Shoreline 58.7
Ocean 58.7
Beach 58.7
Family 58.2
Steamer 57.9

Imagga
created on 2022-01-24

brass 43.5
wind instrument 40.9
man 30.2
musical instrument 30.2
people 28.4
beach 25.4
cornet 24
male 21.3
silhouette 20.7
person 20.1
sax 20
sea 18
men 17.2
walking 17
trombone 17
sport 16.9
sand 16.8
active 15.4
water 15.3
winter 15.3
outdoor 15.3
vacation 14.7
group 14.5
snow 14.3
women 14.2
ocean 14.2
leisure 14.1
sky 14
adult 14
summer 13.5
fun 13.5
black 12.6
lifestyle 12.3
sunset 11.7
coast 11.7
team 11.6
family 11.6
outdoors 11.2
cold 11.2
holiday 10.7
businessman 10.6
travel 10.6
sun 10.5
relax 10.1
device 9.7
landscape 9.7
run 9.6
couple 9.6
business 9.1
recreation 9
play 8.6
season 8.6
walk 8.6
youth 8.5
grunge 8.5
portrait 8.4
old 8.4
exercise 8.2
activity 8.1
mountain 8
horn 7.9
happiness 7.8
boy 7.8
weather 7.8
child 7.8
happy 7.5
danger 7.3
work 7.1
drawing 7.1
life 7
together 7

Google
created on 2022-01-24

Microsoft
created on 2022-01-24

outdoor 96.6
person 95.5
text 94.1
clothing 91.7
man 88
snow 79.4
posing 70.9
group 58.5
people 56.8
sketch 55.6
drawing 54.7

Face analysis

AWS Rekognition

Age 48-54
Gender Male, 99.5%
Sad 89.6%
Confused 4.1%
Happy 3.5%
Disgusted 1.5%
Calm 0.5%
Surprised 0.3%
Fear 0.2%
Angry 0.2%

AWS Rekognition

Age 25-35
Gender Male, 100%
Sad 49.6%
Calm 37.5%
Disgusted 6.2%
Confused 2.4%
Angry 1.9%
Happy 1.2%
Surprised 0.6%
Fear 0.5%

AWS Rekognition

Age 42-50
Gender Male, 98.3%
Sad 46.4%
Happy 28.3%
Confused 11.5%
Calm 7.6%
Surprised 3.1%
Disgusted 1.5%
Angry 1.1%
Fear 0.5%

AWS Rekognition

Age 45-53
Gender Male, 52.3%
Calm 70.9%
Happy 14.6%
Sad 11.6%
Angry 0.9%
Confused 0.8%
Surprised 0.6%
Disgusted 0.5%
Fear 0.2%

AWS Rekognition

Age 35-43
Gender Male, 80.1%
Calm 64.2%
Sad 17%
Happy 10.5%
Confused 3.6%
Angry 2.1%
Surprised 1.2%
Disgusted 0.9%
Fear 0.5%

AWS Rekognition

Age 51-59
Gender Male, 67.5%
Sad 89.1%
Happy 2.8%
Disgusted 2.6%
Fear 2.4%
Confused 0.9%
Calm 0.8%
Angry 0.8%
Surprised 0.5%

AWS Rekognition

Age 36-44
Gender Male, 99.4%
Calm 87.6%
Happy 4%
Sad 2.9%
Disgusted 2.3%
Confused 1.6%
Angry 0.7%
Surprised 0.6%
Fear 0.3%

AWS Rekognition

Age 45-51
Gender Male, 98%
Sad 34.3%
Happy 24.1%
Calm 23.4%
Confused 7.9%
Disgusted 4.2%
Fear 3.2%
Angry 1.8%
Surprised 1.2%

AWS Rekognition

Age 48-54
Gender Male, 99.7%
Happy 53.3%
Confused 13.9%
Surprised 9.1%
Calm 7.9%
Sad 7.9%
Disgusted 3.6%
Angry 2.5%
Fear 1.9%

AWS Rekognition

Age 43-51
Gender Male, 96.1%
Sad 54.3%
Calm 28.4%
Happy 12.9%
Confused 2.3%
Disgusted 0.9%
Surprised 0.6%
Angry 0.4%
Fear 0.2%

AWS Rekognition

Age 27-37
Gender Male, 79.3%
Calm 43.3%
Sad 30%
Confused 12.9%
Surprised 4.8%
Happy 3.3%
Fear 2.6%
Disgusted 2%
Angry 1.1%

AWS Rekognition

Age 45-51
Gender Male, 51%
Happy 58%
Sad 24.5%
Calm 7.8%
Surprised 3.2%
Confused 2.4%
Fear 1.6%
Angry 1.3%
Disgusted 1.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Tie 88.9%

Captions

Microsoft

a group of people posing for a photo 98.1%
a group of people posing for the camera 98%
a group of people posing for a picture 97.9%

Text analysis

Amazon

24835

Google

2 e 3 S
e
2
3
S