Human Generated Data

Title

Untitled (Junior League group of man and four women performing kickline in large room with wood floor)

Date

1940-1960

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10033

Machine Generated Data

Tags

Amazon
created on 2022-01-28

Leisure Activities 99.1
Dance Pose 99.1
Person 98.9
Human 98.9
Person 98.8
Shoe 97.8
Apparel 97.8
Clothing 97.8
Footwear 97.8
Person 94.3
Person 94.3
Dance 94.3
Person 86.7
Ballet 75.2
Shoe 69.4
Ballerina 58.9

Imagga
created on 2022-01-28

people 32.9
person 28.2
man 24.2
room 23.4
adult 23.1
women 22.9
business 20
group 19.3
men 18
male 17.7
happy 17.5
team 17
modern 16.1
interior 15.9
smiling 15.9
lifestyle 15.9
portrait 15.5
businessman 15
work 14.9
attractive 14
office 14
professional 13.6
smile 13.5
classroom 13
corporate 12.9
sitting 12.9
indoor 12.8
home 12.8
indoors 12.3
day 11.8
table 11.7
teacher 11.5
meeting 11.3
teamwork 11.1
house 10.9
chair 10.5
together 10.5
urban 10.5
motion 10.3
dancer 10.2
training 10.2
worker 10.1
city 10
working 9.7
window 9.3
casual 9.3
hospital 9.1
businesswoman 9.1
exercise 9.1
health 9
human 9
cheerful 8.9
furniture 8.8
life 8.7
architecture 8.7
gym 8.6
performer 8.6
negative 8.5
two 8.5
relax 8.4
black 8.4
pretty 8.4
relaxation 8.4
color 8.3
suit 8.1
success 8
happiness 7.8
travel 7.7
train 7.7
crowd 7.7
talking 7.6
finance 7.6
walking 7.6
communication 7.6
togetherness 7.6
strength 7.5
one 7.5
manager 7.4
floor 7.4
holding 7.4
executive 7.4
building 7.3
girls 7.3
transportation 7.2
seat 7.1
glass 7.1
film 7.1

Google
created on 2022-01-28

Microsoft
created on 2022-01-28

text 95.8
dance 94.7
person 85.8
footwear 84.5
clothing 82.8
drawing 72.8
woman 70.6

Face analysis

AWS Rekognition

Age 39-47
Gender Female, 74%
Sad 34.9%
Surprised 30.8%
Calm 17.8%
Disgusted 8.2%
Happy 4.2%
Confused 2.3%
Angry 1.1%
Fear 0.8%

AWS Rekognition

Age 42-50
Gender Male, 62.3%
Happy 92.6%
Sad 2.4%
Confused 1.5%
Surprised 1.4%
Calm 1.1%
Disgusted 0.5%
Fear 0.3%
Angry 0.2%

AWS Rekognition

Age 49-57
Gender Male, 98.4%
Surprised 40.2%
Sad 23.7%
Confused 20%
Happy 5.4%
Calm 5.3%
Disgusted 2.3%
Angry 2.1%
Fear 1%

AWS Rekognition

Age 43-51
Gender Male, 98.3%
Sad 80.1%
Happy 13.3%
Surprised 2.6%
Calm 1.3%
Confused 1.1%
Fear 0.6%
Angry 0.5%
Disgusted 0.4%

AWS Rekognition

Age 42-50
Gender Male, 97.7%
Happy 78.8%
Surprised 7.4%
Confused 5.1%
Angry 3%
Sad 2.3%
Calm 1.4%
Disgusted 1.1%
Fear 0.9%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.9%
Shoe 97.8%

Captions

Microsoft

a group of people posing for a picture 51.6%
a group of people standing in a room 51.5%
a group of people posing for the camera 47.9%

Text analysis

Google

MJI7--
MJI7-- YT3RA 2--NAGON
2--NAGON
YT3RA